The Thing About Jetpacks
by Jon Bell & Lukas Mathis

Published August 31, 2013
by lot23 productions

ISBN: 978-0-615-85966-8

All errata pertaining to this edition are listed here:
thethingaboutjetpacks.com/errata/

Licensed under the Creative Commons Attribution-Noncommercial-No Derivative Works 3.0 United States License. You are allowed to quote, send, translate, and reuse as much of this as you’d like as long as you kindly respect these two wishes:

First, you must give us, Jon Bell and Lukas Mathis, credit. Second, you cannot make money off our work. Please email either of us with any questions. Our email is helpfully listed a few lines away.

See? Right here:
jon@lot23.com
lukas.mathis@gmail.com

Twitter:
@workjon
@lkm

Websites:
lot23.com
ignorethecode.net

Preamble

I'd like to talk to you about three things.

First, thing number one. Almost exactly a year ago, I visited Seattle. While there, I met my Internet friend Jon Bell, whom you might know from such things as this book you're currently holding. It's funny how you can meet somebody for the first time, but feel like you know at least part of that person very well. We talked about design a lot, and when I left, I knew two things: one, I like this guy. Two, it would be fun working with him on some kind of project.

Here's thing number two: I'm a horrible procrastinator. One of the techniques I use to trick myself into doing things is something I call "minimum viable goals". I tend to set huge goals for myself. Stuff like "tomorrow, I want to clean the house, finish two projects I'm currently working on, and vacuum the car." When I wake up the next day, all of this work seems so daunting that I can't bring myself to start it. Instead, I just sit on the balcony and read a book. Which isn't bad, of course. But it's not what I intended to do.

So instead of setting these huge, insurmountable goals that are so big that I don't even know where to start, I stop myself, and instead find a minimum viable goal. Instead of writing five chapters in a book, I'm merely going to rewrite a single sentence that has been bugging me. Instead of cleaning the whole house, I'm merely bringing out the trash. Instead of finishing that website I've been working on, I'm merely going to improve the validation error messages on the contact form.

Those are small goals. I can get one of them done easily, and still go read afterwards. So I get one of them done. And then, instead of reading a book, I do another one, because I'm already working on it. And another one. And then I start working on something a bit bigger. And since I've already started, I continue doing it for a bit, because I don't want to leave it unfinished. And before I know it, I've done a lot of small things. And some bigger things. Everything considered, I've ended up doing quite a huge thing.

This is how it often works. Starting with small ambitions is easier than starting with huge ones, but you'll often end up achieving more that way. Okay, I'm bad at foreshadowing, so I'll just come right out and say it: Jon writes about this very same concept in his essay "50 Words", which you'll find later in this book. Don't skip ahead, though, I want to finish telling you about my three things first.

On to thing number three: writing on the Internet can be, well, a sad experience, especially if you have a reasonably sized audience. If tens of thousands of people read what you write, there are always a few people who find some tiny detail in your writing that just happens to be one of their pet peeves. So every time I publish something, I get a few emails complaining about things. Some of these complaints are well-reasoned and interesting and useful. Others, less so.

To avoid the less-so ones, I started writing more and more defensively. I started hunting for any potentially offensive or even just not entirely bland sections in my essays, and removing them, because I knew people would complain. But that's no fun. Writing publicly started to become more and more of a chore, rather than something I was looking forward to.

When Jon asked me whether I wanted to write some essays for a tiny booklet on design that would only be sent to 100 people (no pressure, he assured me), I said yes, because the three things I mentioned above — I like Jon, and I like to do small things, and I like the idea of limiting the audience of my writing — came together in a perfect blend of... multi-flavored ice cream, I guess.

And like in my "minimum viable goal" thing, all of the small things we've done together have ended up creating a big thing, which you're holding in your hands now. We wrote a lot of small essays, and now they're a big book. Which is awesome.

Lukas Mathis
Switzerland
July 30, 2013

Introduction

It was a fun year.

There are plenty of websites that get a following, and plenty of websites that stop updating after a month. Fuckjetpacks.com managed to do both.

The site launched on August 31st with a tagline: "celebrating the future around us". That, along with the notable domain name and this essay named "Why", helped explain what I was hoping to achieve:

I built this site because I believe the soul of a product can be found in interaction design. Put another way, design is not just how it looks, it's also how it works.

Do you remember the feeling you got the first time you used the iPod click wheel? Or the first time you saw Metro? What about the feeling you have after navigating a phone tree for five minutes before being told to call back during normal business hours? Or when your phone battery dies at 4pm? These are the moments where a product succeeds or fails.

But those moments can be difficult to describe in words, or are too subjective, so they don't get as much publicity. Instead we get facts, which are boring: I can tell you how many gigs a phone has, or the color of a car, or the dollar amount of the latest patent case settlement. But describing how something feels is much harder. More personal. More interesting.

And if I do my job right, it'll be a lot more fun to read.

As time went on, I got more convinced of the points in this mini-festo. There really are a lot of sites on the internet that emphasize visuals. Just about any design or tech-oriented site is going to lean towards beautiful renders, discussions of new logos, shaky unboxing videos. Things that are all about the celebration of what you can see, what you can quantify.

But I'm not primarily a visual designer, I'm an interaction designer and a developer. I want to make things. I want to craft whole experiences, so I don't relate to sites that spend more time on the tech specs than the feel of the product.

And besides that, I'm a consumer. The cover of a book, the poster for a movie, or stills from a video game don't tell me anything about the experience itself. The experience is where the good stuff is, and where the best designers are able to make their mark. I wanted to stop fantasizing about jetpacks and start celebrating the little details that are making today's products delightful and worthy of praise.

So I launched the site with four little design essays and almost immediately the site found an audience. A pretty big one. After a month I decided to pull it offline again, and instead started writing a trilogy of books with Lukas Mathis called For 100 Of Our Closest Friends.

As the title suggests, we only gave out 100 copies of each volume, which was our way of getting away from the publicity of the internet and back towards something we hoped would be a little more special, a little more traditional, and a whole lot quieter.

This book is a compilation of everything we published over the year. We hope you enjoy it :)

Jon Bell
Seattle
August 12, 2013

The Benefits of Isolation

I was pleased to discover an essay entitled Solitude and Leadership.

As long as there's been a web, there have been people decrying it. They point to studies that tie it to depression, they cite its isolating effects, and they remind us that nothing is a substitute for face-to-face interaction.

And they're not entirely wrong. But they're only representing one side of the story. The web, it should go without saying, has brought tremendous value to our lives alongside a lot of garbage. It's not all revolutions and time wasted, of course. It's also everything in between.

And so this article is particularly poignant to me, as it doesn't seek to blame the web, or celebrate it. It moves beyond the standard debate entirely and lays its focus on us. Our attention span. Our ability to think. Deresiewicz writes:

I find for myself that my first thought is never my best thought. My first thought is always someone else’s; it’s always what I’ve already heard about the subject, always the conventional wisdom. It’s only by concentrating, sticking to the question, being patient, letting all the parts of my mind come into play, that I arrive at an original idea.

By giving my brain a chance to make associations, draw connections, take me by surprise. And often even that idea doesn’t turn out to be very good. I need time to think about it, too, to make mistakes and recognize them, to make false starts and correct them, to outlast my impulses, to defeat my desire to declare the job done and move on to the next thing.

There's plenty of ammunition in the essay for technophobes, but I don't read this as an anti-technology screed. I read this as a critique of how we learn, create, and grow. An analysis of the time we don't give ourselves.

Later, he writes:

Thinking for yourself means finding yourself, finding your own reality. Here’s the other problem with Facebook and Twitter and even The New York Times. When you expose yourself to those things, especially in the constant way that people do now—older people as well as younger people—you are continuously bombarding yourself with a stream of other people’s thoughts.

You are marinating yourself in the conventional wisdom. In other people’s reality: for others, not for yourself. You are creating a cacophony in which it is impossible to hear your own voice, whether it’s yourself you’re thinking about or anything else.

I think this is a good summary of why I left the web for a while, and why I'm back. As I get more and more dissatisfied with conventional wisdom, with soundbites, with tech tabloids, with regurgitated content, I think there's a greater and greater need for real voices to return to the web.

So here I am, doing my small part.

Further reading:

Solitude and Leadership, by William Deresiewicz: theamericanscholar.org/solitude-and-leadership

Simple.com Has One of the Best Experiences on the Internet

[This post was heavy on images, so it loses a bit in translation to a book. The site and service have only gotten more impressive. I recommend checking it out. –Ed.]

I like my current bank, but I'm moving to Simple.com. It's a textbook case of strong but understated interaction design. Let's dive in!

Passphrases

Banks deal with money, meaning they need to be secure, meaning the bank website should have an awful policy requiring an uppercase letter, a number, a symbol, and changing your password every two weeks, right?

Wrong. Say hello to "passphrases". They're more secure (because they're longer) and easier to remember because they're plain English.

Passphrases are awesome, and Simple is awesome for using them.
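
If you want the back-of-the-envelope math behind "more secure because they're longer", here's a sketch in TypeScript. The pool sizes are assumptions I picked for illustration, not anything Simple publishes, and the model assumes truly random choices, which flatters passwords far more than it flatters passphrases.

    // Entropy in bits for a secret made of `choices` independent picks
    // from a pool of `poolSize` equally likely options.
    function bits(choices: number, poolSize: number): number {
      return choices * Math.log2(poolSize);
    }

    // An 8-character password drawn from ~72 letters, digits, and symbols.
    console.log(bits(8, 72).toFixed(0));     // ~49 bits
    // A four-word passphrase drawn from a 10,000-word vocabulary.
    console.log(bits(4, 10000).toFixed(0));  // ~53 bits
    // Six words: still easy to remember, much harder to guess.
    console.log(bits(6, 10000).toFixed(0));  // ~80 bits

Adding a couple of words is cheap for a person and expensive for an attacker, which is the whole trick.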

Sort by Size

Most financial sites sort your financial activity so that the deposits are at one end of the list, and the biggest charge is on the other end.

Simple understands how people really look at money - they want to see the deposits and the largest charges grouped together. Brilliant.

Nifty Auto-tags

If I tip someone, Simple knows about it. If I pay a fee, they know that too. In both cases, automatic tags appear next to the line item, which is a huge leap forward, interaction-wise.

This is on top of their always-right auto-categorization of charges. I don't know how they made it so good, but they did and it means a lot of saved time.

Super simple saved searches

Want to know how much you spend at fancy restaurants? Do a search for "restaurants > 30". It's that easy.

Now let's say you want to track your spending in a certain area. Easy. Click the big "New Saved Search" button and you can quickly see that data whenever you need it.

Per month, per week, per day views

Knowing how much you're spending overall, or in a category, is nice. But you know what's even better? Seeing how much you're spending per week. It's a completely new way to think about budgets, long overdue, and executed perfectly.

"Safe to Spend"

It's a bit annoying when there's a difference between your account balance and what you can spend. Simple just highlights the money you can spend after subtracting any savings goals you may have.

The best customer support I've ever received

Simple puts customer support front and center, to the extent that it's a pane that slides onto the website from anywhere you are. In my experience, responses usually come within five minutes, and never take longer than a day. It's so good that sometimes I drop into the customer support box just to tell them how much I love their service. And then they write me back within minutes to say thanks.

It's so easy to send money

Click the name, choose a dollar amount, whoosh, a check is sent. There's also an iOS app. Bill pay has never been this delightful.

And so

There's more, but I don't want to give everything away.

Simple is part of the reason I kicked off this blog. A lot of people talk about design like it's magic pixie dust that you sprinkle on a product to succeed in the marketplace.

But it's obvious when a company only understands design at a superficial level. The masthead may be nice, but the actual experience tends to fall apart.

Simple has no such problem. Sure, it looks nice, but more importantly, it works right. It's actually delightful, despite being a banking site.

Take notes. This is how it's done.

"Flick to TV" Represents Everything Wrong With Product Design Today

Let's say you're designing some way to send video from one device to another device (for example, from your phone to your TV), and you think "aha! instead of pressing a boring button, what if we made it so you could just flick the video off your phone's screen up to the TV?"

I admit, it sounds neat at first. But it's classic Jetpack Design, putting "wow" before "it just works". In real world usage, the interaction falls apart.

What happens when you have two outputs, a TV and a stereo? Well, you need some way to select which output you're aiming at.

And how do you make the flick gesture discoverable enough for the mainstream? Cutesy gestures are the mystery meat navigation of the touch era.

This particular feature may actually suffer from the opposite problem, by being too discoverable. If you want to reserve "swipe up" to mean "flick to another screen", how does a person scroll up on a website without accidentally triggering the "flick to tv" gesture?

You could make it a two step process — tap the share button, then flick in the direction of the screen, then hope there aren't two screens near each other that would require a "which screen did you mean?" dialog.

Or you could remove the gesture, leaving you with a button. A single, understandable, discoverable button that can work for multiple outputs. It's not sexy and new, but it works. And we need more "it just works" design in the world.

Apple Avoids the Temptation of Jetpack Design

Since the late '90s, most of Apple's product releases have been met with a yawn by industry watchers even as the products themselves sell better year-over-year.

This is not a coincidence.

A lot of Apple's success comes from avoiding the temptation of Jetpack Design. Here's how they do it:

Pick One Feature

Apple has mastered the art of saying "we did hundreds of things in this release, and here are the highlights" while reserving their massive marketing firepower for a single feature.

Don't Be Too Early

The Newton was a double-edged sword for Apple. On one hand, it had a big "wow" factor and reminded the world of Apple's innovative DNA. On the other hand, it was expensive and Apple had to spend considerable time and energy explaining why a "personal digital assistant" was necessary. It failed.

Now Apple waits for markets to mature a bit before they enter. They've de-emphasized "first" in favor of "best". FaceTime is just video chat. Retina displays are just higher resolution. Siri is just voice recognition. But in all three cases, they grabbed a tremendous amount of mindshare in a short time.

Skip First, Aim for Best

By picking a single thing to trumpet, and by making sure the market is ready for it, Apple just needs to focus on a great execution of their highlighted feature.

Apple has always been upfront about this facet of their strategy. Tim Cook recently said "[Apple will] participate only in markets where we can make a significant contribution."

Tell the World

So you've got a new spin on an idea that the market has seen before. You've executed on it well. You don't have a ton of other features cluttering your marketing message.

Now go to market. Go big. Let everyone know. Blanket the airwaves with a single, powerful, well-honed message that no one can miss.

(Many companies only do this step, and forget about making a product that anyone will care about. This is not recommended.)

Watch the Yawns Roll In

We're conditioned to think that more features are better. That "innovation" means "no one has ever seen this idea before". That new ideas always win in the marketplace.

As product designers, we could learn a thing or two from the way Apple ships "boring", "passé", "me-too" features once a year, like clockwork, and "makes them look pretty".

I'm with Gruber on this one: Apple may be on to something.

Further reading:

How Apple Rolls:
macworld.com/article/1151235/apple_rolls.html

I Recently Saw a Delicious Interaction That Most Restaurants Could Learn From

Imagine a restaurant website that puts special emphasis on whether or not the restaurant is currently open. Instead of describing the open and close times with text, the website checks the current time and speaks to you naturally.

For example: "We're open!" or "Come quick, we're closing in an hour!" or "Sorry, we just closed. We're open until 10pm on weekdays, and midnight on weekends. Hope to see you soon!"

This is when it becomes clear that the website isn't just a PSD blindly exported to the web, as so many are. The designer clearly thought about the three most important questions that people ask a restaurant website, which are:

  1. Are you open right now?
  2. What's on the menu?
  3. How do I get to you?

... and the design employs a tiny bit of software magic to address each of these primary use cases.

Take the menu - when you load the page, it slides down to the part of the menu that corresponds with the time of day. In the morning it takes you to breakfast items, in the evening you see dinner.

When you look up the restaurant's location, the site uses HTML5's location detection to show you the route between you and the restaurant instead of making you type it all in by hand.
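
To make the first of those concrete, here's a rough sketch of the open/closed greeting in a few lines of TypeScript. The hours, the wording, and the element id are all invented for illustration; the only point is that the page checks the clock before it speaks.

    // Hypothetical hours: 8am until 10pm on weekdays, midnight on weekends.
    const OPENING_HOUR = 8;
    const WEEKDAY_CLOSING_HOUR = 22;
    const WEEKEND_CLOSING_HOUR = 24;

    function greeting(now: Date = new Date()): string {
      const day = now.getDay(); // 0 = Sunday, 6 = Saturday
      const closing = (day === 0 || day === 6) ? WEEKEND_CLOSING_HOUR : WEEKDAY_CLOSING_HOUR;
      const hour = now.getHours();

      if (hour < OPENING_HOUR) {
        return "Sorry, we're not open yet. Doors open at 8am. See you soon!";
      }
      if (hour >= closing) {
        return "Sorry, we just closed. We're open until 10pm on weekdays, and midnight on weekends.";
      }
      if (hour === closing - 1) {
        return "Come quick, we're closing in an hour!";
      }
      return "We're open!";
    }

    // A hypothetical element on the page that holds the message.
    document.querySelector("#open-right-now")!.textContent = greeting();

It's a toy, but even the toy makes the point: the page knows what time it is, so it can talk like a person instead of printing a table of opening hours.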

This is why today's designers can benefit from learning a bit of code. Software design often needs to flex temporally and present different things based on each visitor's conditions.

And that's not something Photoshop or traditional visual design education is set up for. Experience design goes beyond PSDs.

They Mean Well

"Chase your dreams" is loving, well-intentioned, awful advice. It has all the empty promise of something we often hear in product design: "make it awesome".

If only it were that easy.

In both cases, people mean well. They're just trying to help inspire you. And that's nice, to a point.

Thing is, most of us don't need platitudes. We need help taking the first step. Many of us know where we want to go, but we're feeling overwhelmed or lost.

The key, whether you're talking about product design, living a fulfilling life, dieting, saving money, starting your own business, finishing a term paper, becoming more organized, cleaning your house, repairing your relationship, or any other daunting task, is this: start.

Stop chasing "awesomeness", close all your browser tabs, and figure out your first step. Make it as small and doable as possible. Break it into steps until the first one feels manageable. Then do it.

Then figure out your next step. Do that. Repeat. Work hard. And one day, yes, you'll look around and see you've done something.

But you won't get there with idle dreaming. You'll get there because you stopped chasing, started doing, and kept it up. Your dreams have a way of finding you when you've been working hard.

Track 8 Is a Joy To Use

[For several years, Apple was criticized for its hyper-realistic design aesthetic while Android and Windows Phone embraced a flatter treatment that Microsoft popularized with what was then called “Metro”. The article below was written in the midst of this debate, which has now largely been rendered moot by iOS 7’s new visual style. –Ed.]

There's a music player for iOS called Track 8. It's one of my favorite apps. And it's very, um, flat.

It's all the rage to talk about how Apple's gone off the skeuomorphic deep end. And for some people, that's the end of the discussion. In their minds, Apple makes UI look too real-world, and therefore its designs are lame.

It goes without saying, but it's a bit more complicated than that.

Let's be honest: Apple's iCal would be a pain to use even if Apple removed the stitched leather. It's just not a very strong calendar application, but the design community seems fixated on the skin Apple chose, which I consider a far less glaring problem. It’s just more obvious in a screenshot.

And on the other hand, I've used plenty of Metro apps that are a pain to use. Because at the end of the day, it's hard to nail an app's interaction model, whether done in Metro, something skeuomorphic, or something in-between.

Which brings me back to Track 8. It's fantastic design, but not because it looks like a Metro app. It's just a fantastic experience, period. I highly recommend it.

What's Your Cup of Tea?

When I was a teenager, my friend and I drove by two people deep in conversation. I pointed at them and said "See, that's my cup of tea." I love his response: "It's so great that you have a cup of tea."

Thinking about how I work best has helped me with my design work. I spend a lot of time listening to my intuition and aligning with processes I know help me to be more creative. It's how I get to my best designs.

Put another way, my "design process" starts with "process design".

Which brings me to the process design I've used for this site. Early on, I had some pretty strong gut feelings about how FJP should work, and those decisions affect the experience of reading it.

First, I write best when it feels like no one is reading. I'm less cautious and more honest, so it's more likely that I'll say something that has some value.

Second, I write first drafts quickly, but it takes me multiple drafts to get to an acceptable level of quality. I have a bad habit of hitting publish too quickly.

Third, software slows me down and frustrates me. When I write in a CMS, my writing suffers. Writing uses a completely different part of the brain than blogging management requires.

And that's why the site works like it does. I'm anonymous so I can focus. I publish once a week so I give myself time to write multiple drafts. I hand-write the pages in plain text and put them through Markdown so I don't have to deal with software any more than necessary.

I'm inspired by the idea that I can lessen the distance between us by stripping everything away but the writing. I want to present you the best content I know how to write, once a week, with no overhead, and let the content speak for itself.

Your Sign-up Form Tells Me All I Need To Know

Some designers treat sign-up/sign-in forms as being outside the core experience for a web app. A few necessarily frustrating hoops before getting to the real fun of the site.

I disagree. Forms are the handshake, the joke to break the ice, the first impression. Your form tells me everything I need to know about your app, and by extension, your interaction design skills.

Each chef cooks roast chicken differently. Comedians all tell the Aristocrats joke differently. Musicians all put their own fingerprint on old jazz standards. Web designers have forms.

Like cooking, comedy, or jazz, it's less about the right or wrong answer and more about what you bring to the art form. Your years of experience, your level of expertise, the story you're trying to tell, your grasp of your user, all of it.

It's all there, in plain view, in the way you design your form.

One Of My Favorite Buttons

Hundreds of millions of people have gone through a flow like this dozens of times:

  1. Take a great photo
  2. Check a social app like Twitter, Facebook, SMS, or email
  3. Decide to share that great photo
  4. Tap "share"
  5. See the "Choose from library" or "Take photo" dialog
  6. Tap "Choose from library"
  7. Find most recent photo
  8. Tap most recent photo
  9. Tap "use" to use most recent photo

But despite it being a very common flow, despite the fact that the best designers behind smartphone platforms tend to sweat even minor details, despite the fact that it's not hard to see how to optimize for this scenario, it still took years before someone invented one of my favorite buttons: "Use Last Photo Taken".

So instead of having to dig through your photos or compose a new one from your camera, you just tap that button to get the most recent photo. Yes. It's so beautiful and clear. So obviously informed by a keen sense of empathy and a study of user behavior. It's perfect.

Of course, it isn't going to cure male pattern baldness, bring peace to the Middle East, or sell more copies of Tweetbot, the application that invented it. Frankly, I'd be surprised if more than 10% of their user-base ever stops to think about it.

But it's fantastic interaction design all the same. It's this attention to detail, this relentless drive to add just a little more delight or remove a little pain, this tendency to never be satisfied with the status quo that can lead to great things. It's just a button, but it's the result of wonderful thinking.

Opening Your Mind So Wide the Ghosts Slip In

When I'm designing, I believe in ghosts. Let me explain.

I'm an analytical person. I believe in science and logic. I don't actually believe in ghosts in any serious way.

But part of great design is taking lateral leaps of logic, of challenging assumptions, letting the world change your mind, staying receptive to new experiences and ways of thinking, channeling the energy and ideas around you, knowing anything is possible, letting your intuition drive your thinking, not saying no, not shutting things down, re-evaluating your point of view, treating everyone as if they have something to teach you, staying mentally agile, sharp, light, nimble, and quick.

And when I'm in that mode, when I'm truly in touch with my creativity, when my mind is necessarily wide open, the ghosts slip in. Of course ghosts might exist, just like of course this design problem has a solution just out of my reach, one I can discover as long as I keep working at it.

In that moment of creative inspiration, everything has to be possible. When I'm designing, I believe in ghosts. I have to.

"I Wish This App Looked the Same On All Platforms", Said No One, Ever

Every company that ships multi-platform software has gone through it. It's part of a predictable and natural maturation process, like how teens suffer through bad acne, or how college kids keep falling in love with Ayn Rand.

I'm referring to the infamous "Hey, since we ship on multiple platforms, why don't we make sure our software looks the same everywhere?" idea. It looks good on paper. But it just doesn't work, and never will, for several reasons.

Learning From Word 6.0

Word 6.0 is a fantastic cautionary tale, and it's a simple story: Microsoft built the Mac version from the same code base as the Windows version in the name of cross-platform consistency, Mac users immediately felt like they'd been handed a Windows program, and the release became infamous. Microsoft spent years winning Mac users back.

Expectations Aren't Portable

No one's arguing Word for Mac should have different features, a different name, significantly different branding, or compatibility issues. These are things customers rightfully expect to work in an understandable and predictable way.

The problem is when you try to port whole interaction models from one platform to another. Mac OS, iOS, Windows, Windows Phone, Android, Blackberry, and web all have different conventions, interaction models, mental models, expectations, and de facto standards. Trying to pretend otherwise will only harm your design.

For example, if you put a "close window" button at the top left of a Windows application, you're not being "consistent". You're not "staying true to your brand". No, Windows users expect it on the top right, so you just designed and shipped a bad software experience. Period.

Speaking With An Accent

Users don't often express it verbally, but they struggle to work with software that speaks with an accent. They might not know what the word "port" means, but they feel it when software hasn't been tuned to their platform.

Watch users try to complete tasks. You'll see them try their standard platform-learned steps, and if they don't work, you often see them shift in their chair. They furrow their brow. Their voice trails off.

That moment, where you make a user think "Uh-oh, this software is going to be hard", is something to avoid. And it happens a lot when you try to force consistency across platforms.

Can Someone Explain the Use Case?

So if there are clear experience problems that appear when people fall for this groupthinky boardroom-bubble foolish consistency pipe dream, why does it keep coming up? What's the use case? What's the problem we're trying to solve?

I guess the thinking is "when a user moves from platform X to platform Y, we don't want them to be confused." Ok, confused how? Presumably they installed the app with your name on it. The icon and logo look the same. The same core functionality is there. What more do you want?

Remember, this edge-case of a user has bought into a new platform, so if you align with it, you'll make the user's transition easier. Resist it, and your application will stick out like a Java applet.

To use another metaphor, users aren't asking for one-size-fits-all baggy T-shirts. They want tailored. They want custom-fitted. And if you won't give it to them, your competition will.

(A Caveat)

There are apps that do look and feel similar across platforms without degrading the experience. Flipboard is a great example. It started on iOS, went to Android, and managed to stay "Flipboardy" without feeling like an inconsiderate port.

The key to their success has less to do with user expectations and more to do with superior execution. At no point in the Flipboard for Android experience does an average user feel like the app is speaking with a thick iOS brogue.

Quantifying exactly why is a topic for another article, but Flipboard pulled it off. Most companies don't.

Further reading:

Mac Word 6.0:
blogs.msdn.com/b/rick_schaut/archive/2004/02/26/80193.aspx

Drawing Inspiration From Roller Coasters

"You know," he said with a grin, "Walt Disney was one of the first experience designers."

It sounded outlandish at first, nonsense, like mashing together an old thing and a modern one just to be retro-hip. Like "flophouse retweet" or "steampunk".

It was a fun discussion. I learned that before Disneyland, theme parks were seedy and dangerous. The concept of a family-friendly amusement park simply didn't exist, but Walt had a vision of the whole experience, not just an idea for a ride or two. He thought maybe he could make things better for people.

The more I've learned about Disney and theme parks in general, the more I've learned about the nature of experience design. How it takes a whole network of subsystems to support a single vision. Business types like to talk about "strong vertical integration", but great experience goes further. It can be awesomely fractal.

For example, you could say, "At my park, I want a family to have a fantastic 8-hour day." That spins up several new conversations around affordability, around appealing to multiple demographics at once, food, bathrooms, transportation, crime, and before you know it you're designing a city.

And even after you've got the city humming along to support this vision, now you have to make it fun. How often can you build new rides? How are you going to handle it when people have traveled all day to try your marquee ride and it breaks down? How can you bring more delight, even as the park gets busier and more complex? How do you deal with lines? Each of these is a master's-thesis-sized topic.

The whole area, this whole way of thinking, is fun to explore because you can go as blue-sky or as detailed as you want. There's a need for great decisions at every level, so there's no way to run out of fun design work.

I've spent a career in experience design feeling distant from the great artists. I'm technically in the same family but I don't feel closely related. I've realized that to be a great experience designer, you don't necessarily need to think like Michelangelo. Think like Disney instead.

Because as experience designers, we work every day, in a series of behind-the-scenes subsystems, to realize a simple vision: we just want to make things better for people.

Andy Crewdson, Where'd You Go?

We all know the story of Harper Lee, the reclusive author who wrote a masterpiece as her debut and never published again. Or Bill Watterson, who put in a tidy ten years writing Calvin & Hobbes before dropping from view entirely. But when Andy moved on, it felt personal.

Andy Crewdson

Andy started a blog about typography called Lines & Splines in 2000, and it was unanimously considered one of the best, most thoughtful blogs ever created.

But after a few years, Andy simply disappeared without any explanation. At the time, we were shocked. These days we call it infocide, and it’s becoming more common.

_why

Years ago, an enigmatic man who went by the name "why the lucky stiff" (_why for short) became a minor celebrity on the web when he (amongst many other things) wrote a beautifully unique comic about writing code.

And then he disappeared, taking all his content with him.

Mark Pilgrim

Then there was Mark Pilgrim, a well-respected online figure, who dove into writing a fantastic online book on HTML5. Thousands of designers and developers grew to rely on his groundbreaking work.

In 2011, his sites were pulled offline. Instead of returning standard 404 errors, they returned the little-known 410 error code, described this way in part:

... the resource is intentionally unavailable and that the server owners desire that remote links to that resource be removed.

That is a man saying, in the geekiest possible way, "please just leave me alone." The whole thing felt very David Foster Wallace, so friends leapt into action, asked around, called the cops, and held their breath. Word came back shortly:

Mark Pilgrim is alive/annoyed we called the police. Please stand down and give the man privacy and space, and thanks everyone for caring.

That was a relief, but it still left a lot of questions. Infocide isn't normal. It just isn't done. Right?

The web is tuned for digital extroversion

We often assume that people who produce also want to be famous. Whether they're actors, writers, designers, or developers, they wouldn't put themselves into the public eye if they didn't want recognition, right?

With follows, likes, favorites, retweets, pageviews, reblogs, and comments, we've provided digital extroverts with everything they need to feel validated for a job well done. And it's a great feedback loop, one I'm glad we have. (People said some touching things this week about this site. Thank you.)

But is there a way to design a creation network for digital introverts? An outlet for people who know what they want to say, are compelled to create, and want to share publicly, but without any of the spectacle? Without any of the hassle?

I wonder.

Further reading:

_why's poignant guide to Ruby:
mislav.uniqpath.com/poignant-guide

Remember to Ask Why

I was working hard on my first website. It was 1994. Someone asked if I was going to use an "under construction" animation.

Let's pause here.

It would be easy to turn this into a "Let's all laugh at the hapless guy from 1994 while we feel superior about our exquisite modern taste and perfect hindsight" story. So let me be very clear about what happened next.

I asked why, but not with a grin, or a sneer, or an eye roll. I asked the way you might spot a friend ordering a beer you don't recognize before earnestly asking "Have you tried that before? Should I order one?"

Turns out he didn't have a good answer. He just sort of shrugged, and I skipped the animation idea.

That moment has stuck with me. I remind myself not to be rude about it, but to always ask why. Because, frankly, an under-construction animation sounded SUPER COOL at the time. It and a million other visually rich, usability-harming fads comprised the Jetpack Design of the 1994-era web.

Turns out "why" served me very well through the years. Trends came and went, and I considered them all. But if I couldn't explain why I should follow them, I didn't.

Not that it would have been a problem if I had, of course. But by putting each decision through "why", I was able to bring a high level of intention to my designs. It was one of the best lessons I've learned.

Opting Out of Military French Toast

There’s an episode of M*A*S*H where a soldier tries to talk a chef through preparing his mom’s perfect french toast. As the instructions are multiplied to serve an entire mess hall, something is lost in translation.

The soldier describes each step of the recipe until the final one, where the chef is supposed to wait for the bread until “you pinch it with your fingers and it doesn’t bounce back.”

The chef takes the soldier’s careful instructions and multiplies them. He dumps huge canisters of powdered eggs and powdered milk into a cauldron, dumps hundreds of slices of bread in after them, and unceremoniously sprays the batch with an industrial-strength hose.

Then he deadpans, “Would you care to pinch it, or shall I?”

It’s a great reminder that some of the best things don’t scale. Designers are well aware that some of the best experiences can be found amongst the small, the intimate, and the heartfelt.

Which raises the question: how can we bring this magic into product design? After all, our customers are used to high-impact, mainstream experiences. Can any product compete with the heady rush of camaraderie, of kinship, or of first love?

And how do we balance this insight against the drive to make the biggest possible impact? If ten people love your stuff, why not go for a hundred. Or a million. Why not?

Because sometimes you make military french toast. If you want to change the world, try reducing your audience. See if you can make them feel something. Don’t settle for mere visibility, challenge yourself instead to cultivate love.

A Brief History of Love On the Internet

Robin Sloan recently asked a provocative question: how can we determine what we really love on the internet, versus what we're merely liking?

If we think back a few decades, the best way to describe love of a digital product or experience has shifted quite a bit. For me, it all started with the ballpoint pen.

Pen

Before Prodigy, before AOL, before Yahoo!, Google, and Facebook, before widespread internet, there were dial-up "bulletin board systems", or "BBSes". You'd call someone's house with your phone, horrible noises would come screeching out of the modem speakers, and with luck, you'd connect.

Early on, the best BBSes were recorded the old-fashioned way - you'd scrawl their number on a sticky note, a napkin, or the margins of a magazine article. The number of excited circles scrawled around a phone number demonstrated how much you loved the BBS it belonged to.

Bookmarks

Then web browsers came on the scene, and with them this great concept of "digital bookmarking". It allowed people to save great website addresses for later. It didn't take long for early web surfers to overwhelm their bookmark folders.

Social Bookmarks

With the rise of delicious and other sites like it, we began collecting bookmarks as a crowd. It drove a fundamental shift away from curating for ourselves to curating for others. And you didn't need a blog to do it.

[this is good]

Everything changed with [this is good], which I consider to be one of the greatest inventions of the early web. It was started by Filepile.org, and was nothing more than a bit of metadata denoting quality.

Which meant we could uncover good things more easily. By simply sorting by number of [this is good]s, it was easy to find the best content on FilePile.

The Like Button

Facebook's Like Button is the moment when [this is good] went mainstream. The concept of favorites was nothing new, but Facebook was able to push the concept further than anyone else, even opening it externally so any site on the internet could tap into the same flagging mechanism.

Which led to some interesting side effects: on one hand, we found a de facto standard for liking something. On the other hand, liking something started losing its meaning. Just because a band has a million "likes" doesn't mean I'll love it any more than a band with three.

Pinterest

Pinterest is the most popular of a kind of curation website that counts mlkshk, fffffound!, and zootool as contemporaries. These sites aren't specifically asking if you like something – it's assumed. As such, these sites are goldmines of higher-than-average quality content.

The sites have moved beyond "this is good" and into "this is who I am". Likes are thumbs-up you leave all around the web; Pinterest has turned into a place where you express your tastes through a tapestry of found content.

The State of Things

And here we are, with two tiers. There's the infinite sprawl of the Internet that may as well be the universe - too large to fully explore, and expanding exponentially.

Then we have the layer on top: a curation layer, an attempt to pull out the best stuff, a way to separate out the wheat from the chaff. A way to say "I like this stuff more than all the other stuff".

But like our bookmark folders, like the stack of books on our bedside table, like our bloated Instapaper queue, our curation layer has gotten overwhelmed. For many people it's so unwieldy that it's begun to lose meaning as a curation layer at all.

Which is why Robin Sloan's thoughts on love are so timely.

Redefining Love on the Internet

Robin put out a must-see iOS app called "Fish: a tap essay". He described it as a

[...] short but heartfelt manifesto about the difference between liking something on the internet and loving something on the internet.

His central tenet is that love on the internet means "something you'll return to", a guideline I find refreshing and clear. It sets a high bar for all of us who try to build extraordinary things. Things that people can love.

Does anyone return to what you do? Would you rather one million people look at your work once? Or do you want one hundred people to return to what you've done? Do you want a vast expanse of followers, or 100 close friends?

A Tale of Two Cars

(Lukas Mathis)

A year ago, I decided that I finally needed to buy myself a car. I came up with some basic requirements: It had to be safe. It had to be practical. It had to be cheap. And it had to be economical.

I researched cars. I looked into fuel efficiency and safety ratings, into trunk volume, price lists, leg room, iPod connectivity, insurance costs, resale value, quality ratings, sound insulation. And I found the perfect car. My test drive took five minutes: I drove the car once around the block, because I already knew that the car did exactly what I wanted. I didn't need to test-drive it; I knew its specs and stats and numbers. Besides, modern cars all drive pretty much the same anyway, right?

I bought the perfect car — and it sucked.

Sure, it did everything I wanted it to do, but I felt nothing for it. And when I did start to feel something, it was contempt.

At first, I wasn't entirely sure why this happened. This is a great car! It does its job perfectly! What's wrong with it?

Eventually, it occurred to me: the car had no personality. It was a black subcompact, just like millions of other black subcompacts. Its engine was quietly humming away, just like millions of other engines. Sitting in the car, the seat felt comfortable, the windows were placed to allow the best possible overview of the street around me, and the dashboard was unobjectionably pretty. Just like millions of other cars.

My car was obnoxiously pleasing, annoyingly reasonable, and insultingly average.

So I sold it and bought another car. An unreasonable car. My new car is the opposite of what I actually need. It's uncomfortable. It drives like a go-kart. It's a roadster, so there's no trunk. It has a soft-top, so there's no sound insulation to speak of and it sounds as if it is about to take flight as soon as you push down on the pedal. On the highway, you can hardly hear your music playing. Oh, and the car only has room for two people. Slender people. Barely.

And I love it. I love driving it. I love looking at it. I love thinking about it. I even love sitting in a traffic jam, because I'm sitting in the most awesome little car in the world listening to Amanda Fucking Palmer singing about doing it with a rock star, while these poor sods around me are sweating away in their soulless stuffy black subcompacts with perfect safety ratings.

Both of these cars cost about the same. One of them is very useful and usable. The other isn't. And yet. And yet. I love the one that isn't.

But it's also an incredibly annoying car. Now when I go shopping, I have to think carefully about what I buy because there's only so much room. I can never offer to drive people anywhere, because space is severely limited.

These two cars show two extremes of product design, and both of them get it wrong.

The subcompact is a purely rational car. It has no emotional appeal at all. It leaves me cold. I wouldn't ecstatically recommend this car to anyone. I would never rave about how awesome it is to my friends. It does its job, but it does it in the most bland way possible.

The roadster is a purely emotional car. I love it. If people ask me about it, I don't stop talking until they interrupt me (and sometimes, even that isn't enough to shut me up). I love telling people about it. But it can't perform the most basic tasks a car should do. I can't use it to go shopping. If I move house, this car becomes part of the stuff I have to move, not a tool that helps with moving.

When designing a product, we need to strike a balance between emotional appeal and rational appeal. Products that tick the most boxes on a feature comparison list may be the ones that do the job people want done, but are often not the ones that end up providing the best experience. Checkbox comparisons may tell you something useful about a product, but they can't tell you whether you'll fall in love with it.

Don't aim for perfection, either. Perfection can be annoying. Which of your friends do you love the most? The perfect ones that get everything right and never make a mistake, or the ones that try something outrageous, screw up, try again, often get it horribly wrong, but at least get it wrong in new and interesting ways? If you think about the products you adore, the applications you love, the clothes that appeal to you, the companies you want to support: are they the perfect, bland ones, or the ones with at least a little bit of personality?

During your design process, you spend so much time polishing your product, removing all the blemishes and imperfections and nits and zits. And that's great. Because in reality, most products need more polish, not less. But sometimes, maybe, it's possible to go overboard. Sometimes, maybe, it makes sense to stop polishing before all edges turn into perfectly rounded corners.

We need to design products that work, that do the job they're made for. But people don't make purely rational decisions. Their emotions play an important part. Our designs need to be useful and have emotional appeal.

But if you can't do both, stick with utility. Sure, driving my roadster is emotionally satisfying, while my subcompact was a tool that left me feeling absolutely nothing. But millions of subcompacts are sold every year, while my roadster was discontinued after two years on the market.

The Perils of Designing For a Sometimes Myopic Design Community

It's not hard to describe what will get picked up by design blogs. It should be clean and modern, it should use a typeface we all love, it should be on a grid, the tone of the copy should be informal and hip, there should be an app, and it should be shown off in a cool and beautiful product video with an indie band as the supporting track. Information workers see it all day, every day. We know what "great design" looks like, in the myopic way the design community has defined it. It's become as formulaic and limiting as a romantic comedy.

It's worth remembering that it's not our job to design for ourselves. It's our job to understand and design for all people. But when we spend much of our time with fellow designers, both online and off, we run the risk of isolating ourselves from a broader audience. There's a reason why a lot of "great design" doesn't succeed in the marketplace. It's because we're often in a designer echo chamber that has a set of values that don’t necessarily match what will resonate in the real world.

Which is why I'm impressed by great sales numbers, great customer support, and experiences that are aiming beyond the topic of the day discussed around the virtual designer water cooler. These are experiences aspiring to something greater, doing their best to leap from the design blogs into people's daily lives. And to do that, your experience can't just be a nice link you pass around. It has to be both well-designed and far-reaching.

If you're like me, you became a builder of things in order to change the world for the better. I love the idea that I can work on a product or an experience and it can affect people. And if I do my job well, I won't just be affecting people, I'll be benefiting them. That's pretty great.

But in order to make a lasting impact, one that really matters, I try not to forget just how insular and myopic the design community can be. When you find yourself on a project where your target demographic isn't like you, it lights up new areas of your brain. It changes your creative process and helps you steer away from the siren song of design fashion and towards the basics: listening well so you can design something tailor-made for your target demographic.

That's where the most novel designs, and the biggest surprises, come from. We don't need another site by designers for designers; we need more great design to hit the mainstream.

Bad Usability for Good Reasons

(Lukas Mathis)

When Apple shipped iOS 6, they replaced their previous Maps app, which used data from Google, with a new app that relied on Apple's own data. In Apple's presentations, the new Maps app looked great, but when people actually got their hands on it, they quickly found out that in many places, Apple's data was significantly worse than Google's.

Where they were previously able to accurately find addresses on the map, they now weren't. People got lost. They missed appointments. And they blamed their iPhones.

Apple blogs were quick to point out that Apple had good reasons for this switch. Apparently Google wouldn't allow Apple to implement turn-by-turn directions in its Maps app without granting Google access to more iOS user data. Predictably, Apple wasn't too eager to give Google this kind of access to its customers.

The same Apple blogs were also quick to point out that it took Google eight years to acquire all of the maps data they now own. Apple, in contrast, was just starting out. Surely, in a few years, Apple's Maps app would have caught up with Google. Of course, that assumed that Google stopped improving its own data, but it's certainly plausible that Apple will be able to narrow the gap in time.

Sure, these are all good points. It's certainly possible to rationalize Apple's decision to replace Google's maps with its own, inferior data, to find reasons for it, to imply that, perhaps, Apple had no choice at all, and to pontificate about how, in the future, things will be much better.

But here's the problem: when I take my iPhone out of my pocket, enter an address, and my iPhone directs me to a point on the map that's kilometers away from where I want to go, I don't particularly care about Apple's reasons behind the change. I just care that, right now, this damned phone doesn't work right.

As designers, we're often forced to compromise. There are business reasons and code reasons and legal reasons that influence what we can do. Maybe something we really want to do is patented. Maybe a design change would be great for our users, but bad for our company's bottom line. Maybe we have an awesome idea for a UX improvement, but it would require wide-ranging code changes that we don't have time for. Or maybe the boss just really loves his leather sofa and wants the app to use that pattern as its background.

And as long as you're honest with yourself, and don't delude yourself about the actual incentives involved, that's okay. As designers, we need to learn to live with such compromises, lest we kill ourselves with a heart attack at the age of 35.

Just be aware that your users don't care about the reasons behind your decisions when they're sitting in their cars, kilometers away from where they wanted to go, with no idea how to get there.

Dogmatic Design

(Lukas Mathis)

As a kid, I read a book about the ancient Greeks. It was mostly about myths and warfare, because that's what little boys like, but there was a short section about architecture, and that's, I think, the first time I heard of the golden ratio. The picture showed a Greek temple, perhaps the Parthenon. Overlaid were lines showing how the proportions between the roof and the pillars on which it rested matched the golden ratio.
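
(In case the term is unfamiliar: the golden ratio is the proportion you get when you split a length so that the whole relates to the larger piece exactly as the larger piece relates to the smaller one, (a + b) / a = a / b, which works out to roughly 1.618.)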

The same topic, using the same example, came up again in high school. By then, I had become much more jaded, and it occurred to me that the person drawing the boxes showing the golden ratio on top of the Parthenon had taken a few liberties, cutting off parts of the roof, and adding a bit of the foundation below the columns to make everything fit. That seemed a bit weird.

Nowadays, of course, you can just type "golden ratio Parthenon" into Google's image search, and you'll find hundreds of these pictures. Interestingly, they all add the "golden ratio lines" in slightly different places. Some include the foundation, others don't. Some include the edges of the roof, others don't. Some add additional areas where they see golden ratios. Some pictures are taken from slightly different angles, which changes the perspective and causes the golden ratio to appear in different places. Not everybody seems to see it in the same place.

In psychology, there's a concept called "confirmation bias". It's the human tendency to perceive things that confirm our ideas, and to be blind to things that contradict them. I have no idea whether the ancient Greeks used the golden ratio when designing their temples, but it seems clear to me that the people who see golden ratios everywhere might be seeing a lot of things that nobody ever consciously put there.

Worse, the people who use the golden ratio in their designs run the risk of substituting a dogmatic rule for their own good taste. Personally, I have never felt that the golden ratio was any more beautiful than any other possible ratio.

Thinking about this, it occurred to me that interaction designers often have a heavy tendency towards this kind of dogmatic design. Use a different text color for links, and underline them! Don't use graphics for decoration, only to convey meaning! Add a site description to your site's title! Search must be in the top right corner! Don't offer more than seven choices to the user! Use serif fonts for readability! The homepage must look different than the other pages! Don't be redundant! The passive voice must be avoided!

It's not that these are all bad rules (though some are pretty horrible). They just don't necessarily give the best possible result in every situation. To produce as good a result as we can, we must avoid dogmatic design. People are weird. They behave in strange ways and they like the most curious things. It's hard to predict human behavior. And so it's not a good idea to rely on rules alone. Don't be dogmatic about design. Question your decisions. Test and validate everything you do.

The Uncanny Valley of Assumptions Hidden in Guessware

Assumptions often get in the way of a strong user experience. Capacitive buttons assume I meant to touch them even when I didn't. Voice control guesses at what I said and usually gets it wrong. Most music or movie recommendations are tuned to an average of all people, meaning they almost never appeal to my tastes. I get suspicious when software makes educated guesses on my behalf, because it's so often wrong. And then I have to do twice the work to fix it.

Which is why I'm raising an eyebrow at all the proposed innovations in smartphone software that revolve around making assumptions. Because smartphones know our current location, the sites we visit, who we text, the Facebook friends we follow most closely, the speed we travel on the highway, when our next dentist appointment is, what's on our todo list, how many hours we spend playing video games, how much we spend online, our home address, and a million other signals, there's a belief that the era of "smart recommendations" is here.

I'm not sure I agree. I think today's recommendations go just about as far as they should, and any expansion of them will feel more creepy or frustrating than genuinely useful.

Why? Because even after analyzing tons of data, software can never do better than a guess. Sure, assumptions can sometimes lead to valid recommendations, but they carry with them a cognitive burden. People want to search for specifics like "good mexican food", not tee up a parlor trick by asking "what should I eat tonight?" and hope the software gets it right.

There's a common belief that the march of technology will solve all problems, that the things software lacks this year will be figured out next year, or in ten years. But assumptions, no matter how much technology you aim at them, will never feel completely reliable. There will always be some discomfort associated with them, because they're still just options auditioning for our approval.

Not even my close friends or family can predict what I want to do with reasonable accuracy. There's not an algorithm for what a person may be interested in doing next, and there never will be. No friend can predict it, let alone a computer relying on imperfect data and missing key context.

And even if we could trust the results, and even if we were itching for software to make assumptions about our lives, it still leaves one major issue: privacy. People don't want to be tracked every minute of the day. They don't want a log of every purchase they've made on the off chance that software may be able to present a coupon they might like.

What's the value we're adding? We're asking users to give up their privacy so we can give them imperfect recommendations that they then have to tune and sift through so that maybe we land on something they could potentially enjoy.

Experience designers should focus their magic after information has been requested, and not a second before. We shouldn't be trying to give the user what they want before they ask for it. We should be putting our attention into returning exactly the right thing the moment they ask for it.

If we don't, we risk falling into an uncanny valley of assumptions. Users will want to believe that the latest guessware will make them more productive, more happy, entertained, and informed. But it'll be forever doomed to be just wrong enough, just often enough, to feel like a burden.

When you add recommendations to your software, your first assumption should be "my software won't be right as often as I'd like." Design with that in mind.

[After this was written, Google Now popularized the “When you get to the airport, tell the user their flight is delayed” scenario. A great example of presenting data at precisely the right time, even when a user doesn't actively request it, with little downside. –Ed.]

Bad Tools

(Lukas Mathis)

Have you ever noticed how bad most of the tools we use are?

I bought three different humidifiers before I found one that actually works decently. I have two drills, because when I bought the first one, I didn't realize that I should probably get something called a "hammer drill", which is the only type of drill that actually drills into everything you might want to drill into.

Nowadays, when I buy furniture I need to assemble myself, I start out by replacing all the screws that came with the furniture with ones that don't fall apart when I try to tighten them. I'm not sure how much it would cost furniture manufacturers to include working screws with their furniture. Apparently too much, because they don't.

And that brings me to the screwdrivers. If you go to a hardware store and buy the first set of screwdrivers you see, you'll end up with a set of screwdrivers made of such soft metal that the tips are completely chewed off after two uses and you can throw them away. Why would anyone even manufacture something like this?

As a kid, I'd often build things with the tools in my granddad's garage. He had very, very old tools. The screwdrivers had wooden handles and the things were scratched to hell. They looked terribly abused. But they still worked. The tips still had their original form. These were tools that were built to last. You bought them once, then used them for the rest of your life. Sure, they acquired a patina, but they kept on working just as well as the day you bought them.

Let's talk about software. Whenever I need to do something new on my computer, I end up downloading half a dozen applications that claim to do this new thing. Around four of them don't work at all. One usually works, but barely, and (if I'm lucky), one gets the job done well.

If you're going to invest the time into making an application, why not make it one that works well? Why not make the screwdriver that people two generations later can still use and love?

I don't like the term "craftsmanship", because building an application is not the same as building a chair. If it's a reasonably complex application, it's much closer to building a bridge, and you don't have craftsmen building bridges. You have engineers.

But when it comes to taking pride in your work, I think the word "craftsmanship" applies. If you're spending all of that time building something, why not do it right?

Don't be the person who makes yet another one of the four apps that don't work right, serving only to make it harder to find the one that does. If you're not proud of the product you're working on and you don't want to make it the best damn thing you possibly can, you're doing your users, and yourself, a disservice.

Design Alchemy

There was a time when the designer worked in Photoshop while the developer wrote code. They didn't talk much. They didn't usually understand the other's skills and the respect ended at lukewarm statements like "I couldn't do what they do".

Too often, both the designer and the developer thought the other side was holding the product back. The designer was criticized for undervaluing technological constraints while the developer was criticized for undervaluing aesthetics. I've been the developer and the designer in scuffles like these, and I've learned they get in the way of creating great experiences.

But the industry has gotten better at cross-discipline collaboration, and it's improving every day.

When developers and designers blend

Little by little, developers learned some design and designers learned some code. A new kind of product began to hit the market, one that felt delightful, beautiful, fast, and strong. These products came from cross-discipline teams who genuinely appreciated each other, processes that brought developers to the whiteboard for brainstorms while encouraging designers to prototype their ideas, and an organizational structure that allowed the flexibility to break out of the traditional role silos.

A kind of design alchemy began to take place, where by adding a developer and a designer together closely enough, a completely different kind of experience resulted. A better one. It's gotten to the point where the resulting products have an unmistakable feel. Simply put, if your product isn't delightful, it's a sign your team's various disciplines don't talk enough.

The Marriage of Hardware and Software

We're starting to see the same thing with hardware and software. For decades, Apple's "control the whole widget" approach was scorned, mocked, and safely ignored as it became clear that their business model was inferior to Microsoft's "big tent" ecosystem strategy.

The conventional wisdom was as strong as anything seen in computing: Macs were too expensive, PCs won because they were cheaper, provided more choice and freedom, and benefitted from an ecosystem where multiple partners contribute.

Until the iPod. And then iPhone. And iPad. And somewhere along the way, Macs started taking bigger chunks of the market. Apple made it look easy, so companies took notice. They soon realized Apple's key strength was that they designed the hardware and software to work well together. Apple knew before many others in the industry that there's a 1+1=3 effect that happens when a product's hardware and software are done by a single company.

It took years of struggle, but Steve Jobs' original vision of "computing as an appliance" has found its way into the mainstream. Microsoft's following the same model with Surface. Amazon is shipping their own tablets. So is Google. HP. Barnes and Noble. Even Kids R Us!

There's a Better Way And Your Customer Has Seen It

Meanwhile, the old "one company designs the software, a different company designs the hardware, and together hopefully we'll succeed" model is failing. Is it because it can't realize low enough prices? No. Is it because it can't scale? No. Too few workers, not enough demand, channels, or materials? Nope.

The old model is failing for a more mundane reason: it results in inferior designs, just like when the designer and developer didn't talk to each other. It results in obvious seams in the overall experience, whether trying to get customer support, trying to figure out how to install supported drivers, or even getting basic software running. Google is selling a lot of Android licenses, but Android's customer satisfaction trails that of iOS because it's an objectively worse overall ownership experience.

Thanks to Apple and others, consumers have gotten a taste of a more perfect union of hardware and software, or designer and developer. Our products aren't perfect, of course, but we've made huge strides in product quality over the last decade. And customers can tell.

Like code and design, hardware and software may be different things, but it's the careful marriage of the two that results in the best experiences. That's where the future is heading. That's where the best experiences will be made. That's what customers will demand.

So what are the next two things that will combine to make a larger whole? What are the two items that are being considered as two different things rather than being envisioned as one cohesive unit? Where's the next design alchemy surprise going to come from?

I think it's between hardware and the cloud. We're still treating them as two separate components, and the seams are showing. If history is any guide, change is coming. I can't wait.

Designing for the Cloud: A Manifesto

First things first: "the cloud" is as much a marketing term as it is a true technological breakthrough. It's not as if storing things on servers is new, after all. But as disk space gets cheaper, as people continue to expand the role they're willing to entrust to the cloud, and as expectations around networked computing get more robust, it makes sense that we'd call it something new.

But while "the cloud" is a newer term, full of boundless promise that we're marketing to within an inch of its life, we're still treating it like a dumb connection to the server. There are two things to remember about designing for the cloud:

First, it should be treated as a fundamental component for hardware, like a hard drive. Second, it cannot be trusted to actually work when you need it to, since connectivity can't be guaranteed.

These two points seem to contradict, but it's through fully embracing both realities that we get some truly new ways to think about designing for cloud computing. It's a fantastic and fun design challenge.

55 Seconds On, 5 Seconds Off

Every cloud-enabled application should have a testing mode where the internet connection disconnects for five seconds every minute. This is the only way to spot all the problems that emerge when your designs assume an always-on, always-connected, always-fast connection to the internet.

Most mobile applications and websites go bonkers when they get disconnected. One result is the "could not connect" zombie alert dialog that reappears as soon as you dismiss it. And again. And again. The application is so flustered by the idea that the network connection cut out that it's reduced to a stammering mess.

It's assuming the cloud is reliable, which is the first sign that the experience has not been designed with the realities of the cloud in mind.
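
To make this concrete, here's a minimal sketch of what such a testing mode could look like in a browser app, written in TypeScript. The wrapper and its names are hypothetical, not taken from any particular framework: it simply makes fetch fail for the last five seconds of every minute, so you can watch how your design copes with the outage.

    // Hypothetical testing-mode sketch: "55 seconds on, 5 seconds off".
    const realFetch = window.fetch.bind(window);

    function inDropWindow(now: number = Date.now()): boolean {
      // The last 5 seconds of every 60-second cycle count as "offline".
      return (now / 1000) % 60 >= 55;
    }

    window.fetch = async (input: RequestInfo | URL, init?: RequestInit) => {
      if (inDropWindow()) {
        // Mimic a dropped connection rather than a clean HTTP error.
        throw new TypeError("Simulated network failure (testing mode)");
      }
      return realFetch(input, init);
    };

Run your app with this wrapper enabled for an afternoon and count how many zombie dialogs you meet; that's roughly how cloud-ready the design is.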

Blurring the Line Between Local and Cloud

We currently live in a world with two options: there are apps, and there are websites. Apps can talk to the cloud, of course, but they're still apps. What we're missing out on, experience-wise, is the idea that websites should act more like apps.

For example, if I land on your blog, it should be storing files locally. If I click through a few blog posts, then backtrack to your front page, it should know I've been there before. It shouldn't have to ask the server for the data a second time. This goes beyond simple caching: if I turn on airplane mode and go back to your site, the page should still load.

Another example is Gmail. If I land on Gmail, it should contact the server to check for new mail, of course. But if my internet connection were to drop out, the site should (and can) continue working exactly the same way. I should be able to compose a new message. I should be able to open anything in my inbox. I should be able to search through my messages. And when I'm reconnected, everything should work as expected. New mail should arrive, outbox items should be sent.

The only difference between being online and offline on Gmail should be a little cloud indicator at the top of the window. Everything else should act normally. The technology exists and the need is clear. The only limitation is that our designs haven’t caught up yet.
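
On the web, one way to get this kind of behavior today is a service worker that caches pages as they're fetched and falls back to the cache when the network disappears. Here's a rough, offline-first sketch in TypeScript; the file name, cache name, and strategy are illustrative choices, not a prescription.

    // sw.js, registered from the page via navigator.serviceWorker.register("/sw.js").
    // A rough offline-first sketch; names and strategy are illustrative.
    const CACHE = "site-cache-v1";

    self.addEventListener("fetch", (event: any) => {
      event.respondWith(
        caches.open(CACHE).then(async (cache) => {
          try {
            // Try the network first, and keep a copy for later visits.
            const response = await fetch(event.request);
            cache.put(event.request, response.clone());
            return response;
          } catch {
            // Offline: serve what we saw before, or fail gracefully.
            const cached = await cache.match(event.request);
            return cached ?? new Response("You're offline.", { status: 503 });
          }
        })
      );
    });

With something like this in place, airplane mode stops being a wall: the blog still loads, and the only thing missing is whatever hasn't been seen yet.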

Excessive Bandwidth Is Killing Your Site's Experience

When the iPhone first launched in 2007, it brought with it an intriguing and far-reaching detail: the phone came with an unlimited data plan. Unlimited data was short-lived, of course, and now it's hard to find it anywhere in the world. Under this new reality, bandwidth is at a premium.

So our designs need to adjust accordingly. The average size of a website has gone up significantly, to the point where a one-megabyte page is no longer considered a problem. Indeed, it's become the norm on many popular sites. This means that mobile devices are loading pages more slowly (and using more precious bandwidth) than ever before.

If your web app isn't making tough design tradeoffs regarding bandwidth, it's not cloud ready. It's been ported from the desktop experience and it shows.

What It All Means

Great design in the cloud era doesn't mean anything particularly shocking or new. It's the same stuff we've always known about, but we may need some reminding.

First, it's still all about speed. If your experience is slow, people notice. They get frustrated. They don't enjoy themselves. Simply connecting to the cloud is not an excuse to let the experience become laggy. Bad design is bad design, regardless of the reason for it.

Next, the Internet connection cannot be assumed to be reliable, cheap, or even present. Wherever possible, websites should try to cache themselves locally, fail gracefully, and accommodate a world where end users are going into bus tunnels, suffering through poor reception, paying too much for data, and just trying to get things done. Alerts that say "internet connection lost" are sure hallmarks of poor design in the cloud era.

Right now we're treating the cloud like local storage, where the lack of a connection comes as a shock. You wouldn't design a house that falls apart in the rain, yet we're still designing experiences that fall apart when the network isn't accessible.

A Chance for Differentiation

In hindsight, the iPhone seemed inevitable. Of course people wanted multi-touch. Of course they wanted (and were willing to pay for) high quality apps. Of course they were willing to sacrifice a keyboard for a bigger screen. Of course they'd sacrifice battery life for a better overall experience.

It just took a design team with the courage to optimize for these assumptions instead of the short-sighted "it's good enough" design we put up with for a decade of cell phone software. The cloud is in a similar place now. Eventually we'll get it right, and we'll look back and it'll all feel as inevitable as the iPhone.

But first some designers are going to get very, very rich being the first to show the world how a real cloud-based application of the future should work. Why not you?

Loss Aversion

(Lukas Mathis)

In their 1983 paper "Choices, Values, and Frames", Daniel Kahneman and Amos Tversky describe the following experiment. A group of study participants were asked to imagine that they currently held the hypothetical Job A. They were informed about the job's salary and workplace temperature. A different Job B was then described as having a lower salary but more pleasant workplace temperature. Participants were asked whether they wanted to switch to Job B.

Most declined.

Here's the interesting part: a different set of participants were asked to imagine the opposite scenario. Imagine that you're currently working Job B, with the lower salary, but more pleasant temperature. Do you want to switch to Job A? Interestingly, most of these participants also said no.

People in both scenarios didn't pick the job they would have preferred in a completely objective evaluation. Instead, they simply picked the job they already had.

Most people prefer stability over change. They value the things they currently have more highly than the things they don't, for the sole reason that they currently have them. This is called "loss aversion", and it's something you can often see in software development.

On the Internet there are active communities of professional users of Adobe's FreeHand, a vector graphics application. That would not, in itself, be unusual. There's just one problem: FreeHand was discontinued in 2003. Adobe stopped improving the application, and urged its users to switch to Illustrator, a similar vector graphics application. Since then Illustrator has benefitted from a decade of development, while FreeHand has remained completely unchanged. Yet there are still online communities of people trading tips on how to get FreeHand to run semi-properly on their modern computers.

When Apple tried to remove the Apple menu in Mac OS X, Mac users whined so loudly that Apple put it back, sans all of the features that made the old Apple menu useful. But people were mostly okay with losing the menu's functionality, just not with losing the menu itself. They didn't complain because they actually needed the Apple menu; they complained because they felt that Apple was taking something away from them. They might not use it now, but perhaps they would in the future, right? By bringing back the menu, even in its useless state, Apple allayed these fears.

Right now, you can open up pretty much any Windows opinion site on the Internet, and find similar rants about Microsoft's removal of the classic Start menu in Windows 8. The more things change, the more people complain about wanting the old things back.

Keep this in mind when adding a feature to your application: people value new features much less than features they already have. A feature that may not be a huge benefit to many people may turn into something that is jealously guarded by your users, just because they already have it and don't want to see it go. Most of these users may not even use any of the features they so vigorously defend, but they might want to, at some point in the future, and they don't want anyone to steal that option from them.

So, taking things away from people causes them much more grief than giving things to people gives them happiness. Keep that in mind when deciding which new features to add to your product.

Boiling Frogs

(Lukas Mathis)

When Microsoft released Windows Vista in 2007, users were up in arms. Earlier releases of Windows used to be causes for celebration. People lined up for Windows 95. There were lines in front of stores, launch parties, and the TV news reported favorably on the whole thing. Not so with Vista. Most people preferred to stick with its predecessor, Windows XP, and Microsoft was forced to let hardware manufacturers keep shipping new PCs with XP preinstalled for years after Vista's release.

Apple also released a new OS in 2007. This time, it was Mac users celebrating. People lined up to buy Leopard, Apple's sixth revision of Mac OS X.

Why were users ecstatic about Leopard, but unhappy with Vista? One reason for this discrepancy is that Microsoft hadn't released a new version of Windows in six years. Windows XP had come out in 2001. Vista packed six years of features into one humongous update, forcing people to re-learn many of the things they had come to take for granted.

In contrast, Apple released six different versions of Mac OS X between 2001 and 2007, going from Mac OS X v10.0 ("Cheetah") all the way to Version 10.5. This allowed Apple to introduce new features gradually, at a slow pace, allowing people to grow their knowledge of the OS alongside Apple's updates.

Like the (hopefully proverbial) frog that's thrown into boiling water, Vista users immediately wanted to jump out again, while Apple users comfortably sat in the slowly heating water. Admittedly, that's not a perfect analogy, because you don't end up cooked after using either operating system, but the general concept applies.

It's often better to update products incrementally and slowly. Between 2001 and 2007, Apple arguably made as many improvements in Mac OS X as Microsoft did in Windows, but to users, it didn't feel that way, because Apple released updates piecemeal, slowly spoon-feeding changes to its users.

You can do the same.

Releasing often has other advantages. If you're on the wrong track, you'll get feedback sooner. You won't invest years of your life into something that nobody wants.

It's also more satisfying. Nothing is more stressful than seeing people struggle with an older, obviously inferior version of your app, while you're using an updated version that doesn't have any of the issues your users regularly encounter.

Releasing early and often doesn't always work, but if you can make it work, it's an option with many advantages.

Delivering an Opus On a Punk Budget

A pottery class was told they'd be graded in one of two ways: either on quantity or on quality. The incentive for the quantity group was to make as much pottery as possible, and the incentive for the quality group was to make one great piece. To use a musical metaphor, one group thought like punk rockers, and the other like master composers working on a magnum opus.

A funny thing happened. The punk group ended up with both quantity and quality: they completed the most work, but also produced the higher quality pieces. It's unsurprising in hindsight, of course. Skills take practice, so the group putting in more hours was destined to do better in the long run.

Unfortunately, it's hard to replicate these results in the real world. Designers doing client work are lucky to get a few days to collect their thoughts, let alone a whole three months to explore. Professional designers are expected to design as close to perfection as possible, as quickly as possible, within a budget, and delivered with a smile.

Designers are expected to deliver an opus on a punk budget.

This is why experience is so important. Early in a designer's career, he or she should be sketching constantly, keeping an inspiration journal, exploring new ways of thinking, forgetting everything and starting over, and finding the courage to be loose, wild, and adventurous. Short bursts of raw exploration, done over and over again, help a growing designer lay a good foundation.

If a designer at this level is not experiencing failure, it's because they're playing it too safe. It's likely they will fail later in their career, with much higher stakes, because they've tried to become a perfectionist too early. Perfectionism can be earned later in a career when a creative person sets out to make their opus. But trying for perfection too early in a career often points to a designer who's timid, and too afraid to experiment.

On the other hand, a designer with a lot of experience shouldn't be as sloppy as he or she once was. Not with the client watching. The experiments, the beginner's mind, the wandering through new knowledge is as important as it ever has been, but it will rarely be marked down as billable time. Experienced designers are on the clock, and clients need to feel that they're getting all the brilliance of an opus in their allocated time.

It's about wearing the right hat for the situation. It's about knowing what the client thinks about design work (he's suspicious of it), what he thinks about your rates (they're too high), and what he's expecting in return (a masterpiece). It's also about knowing how creativity actually works (it's sloppy and non-linear). It's about having a plan.

Here's mine: I'm a punk during expansion, and the master composer during the reveal. It's the contrast between the two that makes for great theatre, and it's great theatre that proves to the client that your idea is a masterpiece.

Let's imagine the deal is signed and you're off and running. Put on your punk hat. You should be going broad, iterating madly, and letting your ideas fail fast. Your project room should look like a disaster but there should be interesting explorations everywhere. Some of them are awful, but that's ok. You're in punk mode. You're in quantity mode. You're not falling in love with anything. And above all, you're documenting everything prodigiously.

You do this as long as you can, until you hear that your deadline is coming. The client will be here in three days, and they're expecting to see some fancy design thinking. They want perfection. They want your opus. That's fine. You've got all your raw material gathered, now you contract by channeling the master composer. This is where your meticulous photographic documentation comes in. It's going to help you find the great ideas, and it's going to be the backdrop for your theatre piece.

Your presentation to the client will be focused on a small handful of brilliant ideas, and you will explicitly call attention to the huge mess of work you used to arrive at them. A lot of it is faked to give a sense of confidence to the client, of course. But you're not lying. You're just putting on a show that represents the years of work it took to be able to recognize perfection when you see it. The mess of papers on the wall behind you is saying "I've put in my time. I've experimented with thousands of different ideas, and when I tell you these three solutions are the ones, you can trust me."

The client will always have revisions, doubts, and concerns, of course. It's part of the process. But if you frame your presentation correctly, the client will know you did your homework. You can use the quantity as a backdrop to make the quality pop. You can use punk rock to make sure they hear your opus.

Solving Problems by Thinking Really Hard

(Lukas Mathis)

Most people consider themselves to be pretty rational. We assume that, if we just think hard enough, we can solve most problems.

In terms of human evolution, this is a pretty novel idea. It mostly took hold in the time of Descartes. As a rationalist, Descartes thought that reason was the best source of knowledge and that we could get to the truth using our intellect, purely by thinking about things, rather than by evaluating empirical evidence and experience.

Though most people still think as Descartes did, history has shown him to be quite misguided.

Even people whose whole job is to think about things get it horribly wrong. Greek philosopher Empedocles did nothing but think about things. All this thinking eventually led him to conclude that everything was made up of four elements: earth, water, air, and fire. Later, Aristotle thought a bit more, and eventually argued that clearly, Empedocles was wrong, because he forgot to include aether in his list of elements.

In all the centuries of thinking about this, none of these professional thinkers ever came up with the standard model of particle physics. It took scientists to find out how the universe actually works. It took observation and experimentation, experience and evidence.

Empedocles and Aristotle didn't fail because they were stupid or because they didn't think hard enough. They failed because the premise of their approach, that you can find answers to non-trivial questions about the real world by thinking really hard, was wrong. Why? Because the real world is incredibly complex and weird, much more so than we can imagine.

What are things made of? We didn't figure it out by reasoning about it. We figured it out by doing experiments.

This doesn't just apply to particle physics. How will people react to this user interface? Will they understand this text? Will they read it? Is this button prominent enough? Will people be able to use this tool efficiently?

You don't find valid answers to these kinds of questions by thinking about them, even if you try thinking really, really hard. Instead, you need to do experiments. Try something, test if it works, and, if it doesn't, try something else. Repeat until problem solved.

Release

I think a lot about the reputations that creative people have. Most people think of artists and writers as being a little bit crazy. It's accepted that the craziness is what helps them create. But there are prolific, eccentric, brilliant creators who have led happy, successful lives despite having strong creative impulses. So what have they been doing?

I think they've learned the art of release.

I think of it like an electrical current. The shock of inspiration comes quickly and it absolutely must be released somewhere. A creator should let it charge through their body as briefly as possible and then find its home back on the ground. If they hold onto it too long, unnaturally, that's when they can get burned.

There are parallels with sex, too. I read a book where the author described her libido as something that wasn't a part of her moment-to-moment thought process, but when the mood was right, nothing else would do. I don't think it's a coincidence that the feeling of sexual climax is referred to as a "release". We're letting go in that moment, and the feeling is euphoric.

Which is the mindset I have around creativity. When inspiration hits, it's a surge of energy that must be released somehow. Like the athlete who hasn't been to the gym, like the sexually frustrated teenager, like a person holding onto an electrical current too long, failing to release can be tremendously frustrating.

I love all phases of a project. It's fun to start new, to go blue sky and imagine possibilities. I also love the peace that comes in the middle of a project, where the ideas are flowing and everything makes sense. But the end of the project, the release, is my favorite. I don't just love it. I need it.

The Fear

A few years ago, I opted out of writing online. I stopped maintaining my sites, I stopped caring about the latest tech and design news, and I watched from the sidelines.

Eventually I got inspired again. I wrote some anonymous essays, ran them by friends, got a new domain, and launched the kind of site I'd be happy to read. Boom. Within days, it was clear I had a hit.

I've published a lot of content over the years, but I had never experienced the level of attention my little blog got, and I don't expect to experience anything like it again. I hit the pageview jackpot, completely by accident.

And right at the peak of the most flattering response I've ever received, right when you'd think I'd have been thrilled, a funny thing happened: I got The Fear. And nothing kills creativity like The Fear.

Dancing, Drawing, and Deuce

Most people dance best when they're in the moment, feeling the music in their bones, and reacting without second-guessing themselves, without caring if anyone’s watching. It's beautiful because it's so raw and pure.

But self-conscious dancing is completely different. It's hesitant, it's restrained, it's usually half a beat off. It can be painful to witness.

Drawing is the same. I like to draw comics, and I know the lines look best when they're thrown onto the page. I practice drawing the panel several times on scrap paper until I can throw the drawing in bold, confident gestures.

If I let my mind go blank, the result is good. But as soon as I slow down, try to get a line just so, or consider tracing the lines, the line wobbles and the sketch loses its soul.

I used to play tennis a lot, and in high school I realized if I let my mind wander, even a little bit, I wouldn't hit the ball well. I'd approach the shot, plant my feet, prepare my racket, twist my body and wrist forward, and-

If a thought managed to force its way in - "I wonder if Anne will be here soon" or "I'm thirsty" or "I really need to study for that test" - then the ball would bang into the rim of the racket, impotently hit the net, or go out of bounds.

But if I could stay focused and let my arm flow through the movement, it'd be a strong shot. I learned that the difference between a successful volley and an unforced error was all in the composition of my mind in that moment.

Stifling the Writing

The only difference between volume two of For 100 of Our Closest Friends and the first is The Fear. I'm second-guessing its quality. I'm over-thinking how you'll feel reading it. I'm missing all the publicity I got on the blog. I'm wondering if anyone liked volume one, or if they did, if they'll be disappointed in this one.

If the last volume had a lot to do with love, with cultivating authentic experiences, with going deeper rather than broader, this one is grappling a lot with inspiration. With fear. With getting stuck in your head, admitting you're stiff-legged and awkward on the dance floor, and doing something about it.

The Fear isn't your friend, and it's not kind. It's not going to get up off your chest after it thinks you've had enough, just to pity you. Instead, The Fear will stay on you, stifling your breath, until you finally get tired of it, until you bolt up, push it off, scream it away, and reclaim yourself by asserting something in its place.

Here's Volume Two, Damn It

I'm wishing I had never heard the term "sophomore slump". I'm wishing I were aflame with burning ideas, inspiration coursing through my veins, ideas swelling and dripping from the ends of my fingertips like rain. I'm remembering wistfully how I danced like no one was watching, back before anyone found my little blog.

And I'm realizing it was that unconcerned and intuitive posture, that lack of The Fear, those pure strokes of inspiration that sent sixteen shots just over the net, perfectly placed, over the summer.

I'll get back there again. This is me fighting back, a word at a time.

Wearing Computers

(Lukas Mathis)

Books used to be incredibly precious. Each individual page of each book had to be hand-written by a scribe. Only the richest people could afford to own any. Smaller monasteries had only a few of them. Books were so expensive that by the end of the Middle Ages the papal library in Avignon only had about 2000 books.

As a result, only the most precious books were copied. And since monks did much of the copying, that basically meant religious books.

In 1450, Johannes Gutenberg invented the printing press, and re-invented movable type (it had been invented in China 400 years earlier, but never made it to Europe — something unthinkable in today's connected world, but still pretty common even half a century ago). Movable type allowed typesetters to create "formes" — essentially a huge stamp used to print an individual page. The forme is mounted in the printing press, coated with ink, and pressed onto paper to create a printed image.

When printing presses were first invented, though, they were not used to print new kinds of books, much less the kinds of books we find in today's bookstores. Like the monks in the monasteries, Gutenberg used his press to print religious texts: church documents, papal letters, bibles. It took until the 17th century for people to discover that they could do new things with printing presses, things that weren't possible when you only had scribes. Newspapers and the kinds of modern novels we take for granted nowadays eventually became popular — more than 150 years after Gutenberg first got the ball rolling.

In the 1880s, the movie camera was invented. At first, it was used to film simple, static scenes. People in a garden, a person sneezing, a train. "Movies" were less than a minute long, presented by traveling exhibitors. Then people started to film plays, and eventually, they discovered that they need not be constrained by the rigidity of theater. It took less than 30 years for stop-motion animations, moving cameras, continuous scenes with cuts, and other modern movie techniques to become popular.

Getting it Wrong

We always see new technology in the context of our existing technology. The printing press was used to print bibles. Movie cameras were used to film theater productions. It takes a while for our brains to adjust, for the new possibilities to become apparent. At first, we usually get it wrong.

But it also seems that this timespan is getting shorter. It took 150 years from the printing press to the first newspaper, "Relation aller Fürnemmen und gedenckwürdigen Historien", but only 40 years from the movie camera to D. W. Griffith's "The Birth of a Nation".

When it comes to wearable computing, we are now squarely in the "we have it, but we're getting it wrong" phase.

Today all of us are walking around with at least one computer on our bodies. The phone you carry with you is more powerful than the most awesome supercomputers I read about in magazines twenty years ago. Your phone would utterly destroy a Cray X-MP, a machine that would fill your bedroom, and was the world's fastest computer in the early 80s.

If you're wearing a watch, it's likely that it is more powerful than the first PC I owned. Even your Fitbit has more processing power than the first videogame console I owned.

We aren't wearing fancy glasses with built-in screens, but we've definitely entered the age of wearable computing.

When you listen to people envisioning the way wearable computing will work, they often depict a crazy future where we're constantly being barraged by notifications and information and ads. Google Glass detects what we're looking at, and offers additional information. Oh, I see that you're looking at a poster for a concert, do you want to buy tickets for that?

[This article was written before anyone had tried Google Glass, and all indications were that Google was planning on exactly these kinds of scenarios. –Ed.]

Can you imagine how long it would take for people to go from, "Wow, that's neat" to "Shut up! How do I turn this crap off?" I'm betting less than a minute. This doesn't sound useful; it sounds stressful. The Google Glass video is the equivalent of the traveling exhibitor, putting on a movie of people standing around in a garden. I don't want to see that. Nobody wants to see that once the novelty is gone.

We're imagining that we'll use wearable computers in the same way and for the same things as we use our laptops and iPads. We use Wikipedia on our computers, so how about a pair of glasses that shows you Wikipedia entries for the things you're looking at?

We won't use wearable computers for these things. We'll use them for new things.

I do think, however, that we're seeing the first hints of how we're going to use wearable computing.

Passive Tracking

Right now, the most common thing we use wearable computers for is tracking. We track ourselves with a Fitbit or Nike FuelBand. This data can be correlated with that of our scale and blood pressure monitor; at the end of the month, we get a nice graph, depicting how well we're doing.

Are you gaining weight without noticing? Have you been slacking off? Is there a correlation between these two things? Passive trackers don't get in your way, but they do help you live a healthier life. In the future, we'll be able to measure more things, more accurately. Our computers will know what we eat, what our blood sugar is, what our average heart rate is. They will be able to give us specific suggestions for improving our health.

Gamification

When I go jogging, I run from Zombies. "Zombies, Run!" on my phone turns running into a real game, with a plot, different levels, and increasing difficulty. It makes running enjoyable, something I look forward to.

Since we always carry these devices with us, why not use them to make everyday life more enjoyable? Why not use them to help us keep our resolutions, and to turn us into better people?

Human psychology is easily manipulated. Right now, it's other people who are manipulating us. The ads we see every minute of every day, the unhealthy way women (and men) are portrayed in the media, the way stores are laid out to make us buy the most useless crap in the shortest amount of time: these are just three examples of how we are being manipulated by the media, corporations, and society.

Having a computer constantly with us allows us to tilt the balance a little and to use this kind of manipulation for good. Instead of other people manipulating us into doing and thinking things we don't want or need, we can manipulate ourselves into doing and thinking the things we actually want to do or think.

And maybe, one day, when we all wear augmented-reality glasses, perhaps we'll even have ad blockers for real-life, physical ads.

Social

Humans are social animals. We require connections to other people. Modern technology has allowed us to be connected to more people, more easily, and wearable computing emphasizes this further. It also makes it more personal. Being able to have video chats, or send movies of what's happening around us to our friends can allow us to take part in other people's lives in a way that wasn't possible before.

There are apps that notify you when friends are close, making these kinds of serendipitous encounters more likely. This may be a slight loss of privacy, but it's a loss of privacy we control. We decide who gets to know where we are.

We require human interaction. Wearable computers will help with this fundamental need.

Scripting your Life

When I'm near my plants and they need water, my phone tells me. When I leave my car keys in a restaurant, my Android device beeps at me, telling me to go back and get them. When I arrive at my car, it automatically opens a special user interface for easy access to apps I commonly use in the car. When my washing machine is done, I receive a message. When I need to take an umbrella with me, my phone tells me as I leave the house. When I'm running to the train because I'm late again, my phone has already bought the tickets I need.

Since we always have these devices with us, they know a lot about us. We can use this knowledge to automate things we regularly need to do anyway, to remind us of things we tend to forget (computers are far better at remembering things than humans are) and to generally be the kind of personal assistant that people imagined when they first thought of portable computers.

Enhancing your Mind and Memory

When I was a kid, people used to tie a knot in their handkerchiefs if they wanted to remember something. When they noticed the knot, it reminded them of whatever it was they wanted to recall. I bet most kids alive today have never even heard of that concept. The devices we carry with us remind us of what we need to know, when we need to know it.

Similarly, people used to have expensive sets of encyclopedias. Dozens of weighty books that contained all kinds of nuggets of information. When people needed to learn about something, they'd start out browsing through their encyclopedias. Today, we carry all of mankind's knowledge in our pockets.

In the future, computing devices will augment more of our existing capabilities. Not in an intrusive way, but in the same way they already do: when we need it, how we need it.

Piecemeal Popularization

When I watch the Google Glass video, my blood pressure starts to go up. This seems overwhelming and overbearing. I don't want my life to be a constant stream of interruptions. I don't think that we will one day buy a pair of Google Glass goggles, and suddenly be in the future.

Instead, wearable computing will arrive piecemeal, one tiny, useful nugget at a time. This process has already started. Our phones help us find our parked cars. They allow us to get directions. Maybe one day we'll add glasses to the mix, but even then these glasses will start out modestly, perhaps a tiny screen in the corner of one eye that supports very few useful features. Over time, they will become much more than that.

Wearable computing is in the future, but it's also in the present. It has already started, and we're starting to see the direction it's going in. And I don't think that direction includes constant popups asking us if we want to buy tickets for every concert poster we look at.

50 Words

Every successful project in my life has started small. Not only that, the projects that have made the biggest impact were the ones that started even smaller than the others. This is not a coincidence.

Stephen King is often asked how he's able to write so much, and his stock response is "one word at a time". He's not being coy. That's actually how it's done, both literally and in terms of mindset. One word at a time leads to whole pages, which lead to chapters, and if you can keep it up, one day your book is done.

The problem is that people too often aim at goals that weigh them down. Writing a book, running a marathon, designing a new application, website, product or feature. These are all good goals, but you probably won't achieve them unless you start small.

Let's say you want to write more. For starters, promise you'll write 50 words a day. Yes, just 50. So when it's 11:47pm, you're tired, and you just want to go to bed, you're setting yourself up for success because the goal is almost comically small. Sure, you could still skip it. But 50 words? Ah, you might as well follow through with them.

And that's the worst-case scenario, when you're not inspired. The days where you are inspired will be magical. Say you wake up, jot down your 50 words, and they're pretty good. So you keep going, and not out of a dull grey obligation, but feeling the bright illumination you can only feel from true enjoyment.

Pause right there. That's the secret, but if you blink, you might miss it. You need to find some kind of enjoyment in a task or you will not do it.

Small goals let you hurdle over the obligation aspect and let you get at the joy of the activity that much faster. Maybe 50% of the time, your 50-word goal is a chore. But the other 50% of the time, you're genuinely enjoying it. And when you enjoy it, words will turn to pages. Pages into chapters. Then you'll have your book, or any other goal you're working towards.

Don't set a goal of running a marathon, start with a goal of running around your block every other day. Don't set a goal of applying to 10 grad schools, set a goal of reading the brochure for one grad school a night. Don't set a goal of finishing all your wireframes in a single day. See when deadlines are coming and police your time so you can spread the work over five days, 20% at a time.

The way to find your creative muse is to reframe your big goals as a bunch of small, achievable, fun steps. If you can do that, the rest will take care of itself.

McDonald's Theory

I use a trick with co-workers when we’re trying to decide where to eat for lunch and no one has any ideas. I recommend McDonald’s.

An interesting thing happens. Everyone unanimously agrees that we can’t possibly go to McDonald’s, and better lunch suggestions emerge. Magic!

It’s as if we’ve broken the ice with the worst possible idea, and now that the discussion has started, people suddenly get very creative. I call it the McDonald’s Theory: people are inspired to come up with good ideas to ward off bad ones.

This is a technique I use a lot at work. Projects start in different ways. Sometimes you’re handed a formal brief. Sometimes you hear a rumor that something might be coming so you start thinking about it early. Other times you’ve been playing with an idea for months or years before sharing with your team. There’s no defined process for all creative work, but I’ve come to believe that all creative endeavors share one thing: the second step is easier than the first. Always.

Anne Lamott advocates “shitty first drafts,” Nike tells us to “Just Do It,” and I recommend McDonald’s just to get people so grossed out they come up with a better idea. Once I got an email from Steve Jobs, and it was just one word: “Go!” Exactly. Dive in. Do. Stop over-thinking it.

The next time you have an idea rolling around in your head, find the courage to quiet your inner critic just long enough to get a piece of paper and a pen, then just start sketching it. “But I don’t have a long time for this!” you might think. Or, “The idea is probably stupid,” or, “Maybe I’ll go online and click around for—”

No. Shut up. Stop sabotaging yourself.

The same goes for groups of people at work. The next time a project is being discussed in its early stages, grab a marker, go to the board, and throw something up there. The idea will probably be stupid, but that’s good! McDonald’s Theory teaches us that it will trigger the group into action.

It takes a crazy kind of courage, of focus, of foolhardy perseverance to quiet all those doubts long enough to move forward. But it’s possible, you just have to start. Bust down that first barrier and just get things on the page. It’s not the kind of thing you can do in your head, you have to write something, sketch something, do something, and then revise off it.

Not sure how to start? Sketch a few shapes, then label them. Say, “This is probably crazy, but what if we…” and try to make your sketch fit the problem you’re trying to solve. Like a magic spell, the moment you put the stuff on the board, something incredible will happen. The room will see your ideas, will offer their own, will revise your thinking, and by the end of 15 minutes, 30 minutes, an hour, you’ll have made progress.

That’s how it’s done.

Hidden Complexity Isn't Simplicity

(Lukas Mathis)

One of the more nebulous goals we strive for is "simplicity". It's not always entirely clear what that means. Does a simpler app need to have fewer, simpler features? Or is it enough to just make things look simple, hiding away the actual complexity behind a veneer of straightforwardness?

When you're rethinking a complex application, it's easy to fall into this trap of superficial simplicity. Take iTunes. This is an application that came out in 2001, and even back then, was based on a longstanding Mac media player called SoundJam MP. When iTunes came out, people loved it. When Apple's designers turned SoundJam into iTunes, they removed a ton of features, and managed to create a simple, yet sufficiently powerful media player.

Time hasn't been kind to iTunes.

In the last ten years, Apple came out with the iPod, the iPhone, the iPad, the Apple TV, and a bevy of other services and gadgets. And for some reason, Apple decided to use iTunes as its integration point for all of these things. Consequently, iTunes has sprouted a complex device synchronization feature for iPods, iPads, and iPhones. And a fully-grown app store. And cloud integration. And a music store. And a movie player. And an ebook management system. And audiobooks. And much more.

Today, iTunes is an unwieldy mess.

Tasked with the thankless job of simplifying iTunes, Apple's designers took the cheap way out: they decided not to simplify it. They could have, had they wanted to. They could have broken out the App Store and the synchronization feature. They could have removed a ton of the crap iTunes has accumulated over the years. They did not. Instead, in iTunes 11, they hid the sidebar, and replaced it with a tiny header. Also, instead of a list, the default view now shows your music albums.

(Note that they couldn't even bring themselves to actually remove the sidebar. Instead, they merely hid it, and provided a setting for getting it back. Even when attempting to simplify iTunes' looks, they managed to just add more crap, without removing anything.)

iTunes 11 looks simple. But in making it look simple, Apple had to hide much of iTunes' complexity. This is not an improvement; the complexity is still there. But now, it's lurking below a pretty, but shallow façade, just waiting for the right moment to jump out and bite your face off.

In the old iTunes, you could see the complexity, and try to understand what was going on. Now, it hits you when you least expect it.

Let's say I want to go to the store. Ah, there's a button called iTunes Store. Wait, that's the wrong store; I meant the App Store. Or is it really the wrong store? Actually, App Store is a section of the iTunes Store. It's a bit of complexity, hidden away, so you won't find it when you need it. They turned two things into one thing, but it's a sleight of hand; there are still two things, but you can only see one thing.

So now I'm in the App Store, but I want to go back to my music. Let's click on "Music". Nope, I'm now in the music section of the iTunes Store. Another bit of complexity that just surfaced again. Ah, I can probably click on "Library" to go back. Oh, Library is a menu. I want to see my Music, so instead of just clicking on "Music", I have to select "Library" -> "Music".

Now I want to go to my Apps. Let's go back to the Library menu... nope, it just disappeared. To go from the Music section to the App section, I now have to click on "Music".

To reiterate: if I'm in the store section, I don't click on "Music" to go to my music, I click on "Library". But if I'm in my music section, I click on "Music" (which is on the opposite side of the screen from the previously-clicked "Library" button) to go to my Apps.

That's hidden complexity, surfacing again. It might look simple, but it's not.

The logic of how an application actually works is called its implementation model. The way the application presents itself to the user is its UI model. The user looks at the UI model and forms her own model of how the application works. That's the user's mental model. The closer the user's mental model is to the application's implementation model, the better she is able to use the application.

In attempting to hide iTunes' complexity behind a simple UI, Apple introduced discrepancies between how the app pretends to work, and how it actually works, between the UI model and the implementation model. This prevents the user from forming a correct mental model. It makes it hard to learn how iTunes works, because everything seems like a magic show. Menus appear and disappear, they do different things at different times, they combine different things that should have individual UI elements.

You can't learn how magic tricks work by watching a magic show. Likewise, you can't learn how an application works if you're looking at a user interface that, like a magician, intentionally misleads you about what is actually going on.

The trick to creating a truly simple application is not to make the UI look as simple as possible. It's to minimize the discrepancies between how an app pretends to work, and how it actually works.

Data Design

(Lukas Mathis)

When designers talk about their process, they often talk about things like sketching and wireframing and usability tests. But it occurs to me that this is not what I usually start out with. The first thing I typically design is the application's data model.

What kinds of things are there in the application? What properties do the things have? How are they related to each other? Can we normalize this structure, make it simpler? If the application grows, can this model accommodate the changes?

Recently, I had a very preliminary design meeting about a website that would help people organize soccer matches. This seems like a simple kind of application. You probably have users and teams and matches. Users belong to teams, and teams participate in matches. Well, you probably also need to have events, if there are several matches at the same event.

But wait, if you have events, doesn't that mean that you might not know all of an event's matches beforehand? Maybe the event has some kind of run-off system where the winners of a set of matches play against each other, so the participants of that match aren't known in advance. Okay, let's drop that for now, but still try to design the system so that we might be able to support something like this at a later date.

So a typical use case would be for an organizer of an event to create a new event, add some matches, add teams to the matches, and add players to the teams. But some teams probably already exist in the system; perhaps the team members recreated their own teams in the system. Wait, we probably need to let players create their own accounts. But if they do that, can they choose which teams they want to belong to? Or can only team creators invite players to teams? What if a player isn't yet in the system, but the person who created a team added the player to the team anyway... we need to support something like this, but can the player then claim the spot in the team? What if different people added the same person to different teams, each creating their own player; can the person then consolidate these things into their main account?

All of these questions come down to model design. What are the basic entities in the system? How do they relate to each other? This is the first thing I worry about when designing an application.
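
To make these questions concrete, here's a rough sketch of what such a model might look like, written as Python dataclasses. Every name in it (PlayerSlot, claimed_by, and so on) is an illustrative assumption, not the model we actually settled on; the point is only that the entities and their relationships get decided here, before any screen gets sketched.

    from dataclasses import dataclass, field
    from typing import Optional

    @dataclass
    class User:
        id: int
        email: str                          # a real account someone logs in with

    @dataclass
    class PlayerSlot:
        id: int
        display_name: str                   # typed in by whoever created the team
        claimed_by: Optional[int] = None    # User.id once the person claims the spot

    @dataclass
    class Team:
        id: int
        name: str
        created_by: int                     # User.id of the team creator
        roster: list[PlayerSlot] = field(default_factory=list)

    @dataclass
    class Match:
        id: int
        event_id: int
        home_team_id: Optional[int] = None  # None until a run-off winner is known
        away_team_id: Optional[int] = None

    @dataclass
    class Event:
        id: int
        name: str
        organizer_id: int                   # User.id of the event organizer
        matches: list[Match] = field(default_factory=list)

Separating PlayerSlot from User is what would later make the "claim your spot" and "consolidate duplicate players" questions answerable at all, and that's exactly the kind of decision that's painful to retrofit.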

You might think that it's not really a designer's job to do that; let the programmers, who very likely attended courses teaching data model design, deal with that stuff. You'd be wrong. The model fundamentally defines how an application behaves, what kinds of features it can support. If your model is an afterthought, if it's inconsistent with the user interface, if you didn't communicate everything the app needs to do to your programmers, or if the programmers were lazy and didn't do the model design correctly, your application will never work right.

Start out with the model, and keep it in sight during the whole design process. Collaborate with programmers, but don't let them take it over. It's part of your job.

Laying a Good Foundation

(Lukas Mathis)

The house I live in has a bike rack for visitors. It was built alongside the house, solid concrete and metal. People in Switzerland often travel by bike; it's not unusual to have these covered racks attached to houses, so people have a place to temporarily park their bikes. Our bike rack, though, has two problems. First, it's about 100 meters away from our entrance (that's about 300 feet, for those people still living in the Middle Ages and measuring distance in terms of their body parts). Second, the bike rack is not visible from the entrance.

Lack of convenience and lack of discoverability have conspired to make this bike rack completely useless. In the two years I've lived here, I've never seen a single bike in that rack. Instead, people just park their bikes right in front of the house.

A few weeks ago, I was sitting in a café, talking to an architect friend of mine. I was thinking about our useless bike rack, and a question occurred to me. "How do you know the house you built will actually work?" I asked.

"What do you mean?" she replied.

"Well, if I design something, I do all kinds of tests first to ensure that people will get it. For example, I might build a simple prototype, and let people use it, to find out if they can actually figure out how to use it, if they like it, if it does what they expect. But you're building a house, you can't build a prototype house and let people live in it for a few weeks, to see if they like how the rooms are laid out. So how do you do it?"

"In time, you develop a feeling for how things should work."

"They didn’t teach this? Did you have classes on where stuff should go so it will actually be where people expect it, and convenient?"

"Specific classes for this topic? No. I guess we talked about some of these things."

Well, that certainly explains why many houses are built with such shitty design mistakes. The bike rack is too far away. The built-in coat rack is too small for the number of people who might live in a place. The bathroom window is in an inconvenient place. The door to the toilet is placed awkwardly. Buildings where you have to walk twice as far as you would if they had just built the damn stairs slightly differently. Parking spaces right in front of the entrance, ensuring that you always have to find your way through a bevy of cars to get to the door. Too few electrical wall outlets, placed on the wrong wall.

Rooms, doors, windows, walls, stairs, outlets: it's so easy to put them in a place that doesn't match how people actually use a house.

In comparison, software developers have it good. If something doesn't work right, if we didn't catch it during design, it's almost always possible to fix it later.

But not always.

Sometimes, you can't change the existing database layout, because so much depends on the way it currently is. Sometimes, you can't change a feature, because the file format depends on it, and you want to keep being compatible with earlier versions of the software. Sometimes, you can't move the foundation without shaking up everything you've built on top of it.

And this is why it's important to really get the foundation right. Spend a bit more time doing mockups and prototypes and sketches. Spend a bit more time arguing about the database layout, about how extensible the file format is, about whether you really should allow outside access to your API with the first release, or wait a bit to see if there are any changes you'll have to make that break how the API works.
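
To give one small, concrete example of the kind of foundation decision that's cheap on day one and expensive later: writing a format version into every file from the very first release. This is only a hypothetical sketch in Python; the field names are my assumptions, not a recipe.

    import json

    CURRENT_FORMAT_VERSION = 1

    def save_document(path, data):
        payload = {"format_version": CURRENT_FORMAT_VERSION, "data": data}
        with open(path, "w", encoding="utf-8") as f:
            json.dump(payload, f)

    def load_document(path):
        with open(path, "r", encoding="utf-8") as f:
            payload = json.load(f)
        version = payload.get("format_version", 0)
        if version > CURRENT_FORMAT_VERSION:
            raise ValueError("file was written by a newer version of the app")
        # older files get migrated here, one version step at a time
        return payload["data"]

The version field costs nothing now, but without it, every future change to the format risks breaking compatibility with files people already have.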

The earlier you fix a problem, the cheaper it is to fix it. Even though we're not architects, and even though we can go back and change things, it's still best to get them right on the first try.

The Lesson I Learned in Plano

The guidance was clear. We were designing for a profession where browser plugins were not supported, upgrades lagged by years, and security was paramount. In some cases, no browser was available, so the only option was the command line.

Imagine my surprise when I saw what we were pitching to the client: infographic-heavy dashboards animated in Flash. Using techniques that were about four years too cutting edge and too clever by half. What we were presenting looked nice, but it was a complete mismatch with what the client and the target demographic would actually be able to use.

The 50 field interviews we conducted clearly warned against this kind of approach. We kept writing and underlining phrases like "no plugins?!" and "very conservative upgrade cycle" in our notebooks, and over every lunch, every client meeting, and every internal discussion, we debated how we'd deal with these factors. It was an immovable reality. We knew it. The client knew it.

And yet here we were, pitching the big vision anyway. So what happened? What were we doing advocating for a design that we knew wouldn't work? What was all the ethnographic research for, if we were going to design as if we hadn't done any?

I pulled one of our clients aside and asked what was going on. Was Flash suddenly possible? No. Did they realize we were pitching things only possible in Flash? Of course. Was that a problem? Not at all. All that mattered to the client was one simple word: inspiration.

He explained that he needed to be able to show sexy new thinking to his superiors. He needed to show that he was clever for hiring us. He needed to get people in his company thinking more about the future, about what could be possible, not what was currently possible. He needed us to paint him a picture of tomorrow, not merely a prettier vision of today.

The way he saw it, his field was stuck in the dark ages, and he had two options. He could inspire people with brilliant futuristic thinking, or he could let things stay the way they were, perhaps forever. Or at least until a competitor shipped something first, and he was left scrambling for an answer.

As hired designers, reality wasn't our responsibility. We weren't there to write the code, ship the product, or accept facts. We were there to inspire. We were the upstarts trying to spark a revolution. We were the change agents, the unreasonable ones, the ones asked not to let reality stand in the way of a potentially lucrative new way to think about this unique market.

Our designs never shipped as software. But our audacity, our coached refusal to accept the status quo, our drive for change had the desired result: we inspired. We effected change. Once we realized what we were there for.

That's the lesson I learned in Plano. Some design jobs aren't about the design. Sometimes the designer's role is simply to start a conversation. As strange as it can feel for a traditional production-oriented designer, sometimes inspiration is the deliverable.

All or None Design

Shepherding design thinking from idea to a shipping product is hard. The project may be canceled, all the best ideas may get cut from it, or it may be buggy. There are a million ways for the product to stray from the original vision, if it gets released at all.

This puts designers in an interesting position. They're expected to come up with big, bold, outside-the-box thinking. They're expected to shatter the paradigm. The company can't afford to have them around if they can't invent The Next Big Thing, so they shouldn't artificially limit their vision. But on the other hand, nothing in a team ever gets done without compromise.

Designers should be able to dream big, but be ready to negotiate in good faith. We spend a lot of time training designers about the dreaming part, but not enough time on how to fold technical and business realities into a design in the most graceful possible way. And it's all about attitude. Designers can choose to accept compromise as "settling on a plan B" or "revising their first draft".

Or they can reject compromise outright. But that dooms them to "all or none" design, where a designer dreams up something great, and then grumbles under their breath as the client, the developers, the marketers, and reality chip away at it. This approach isn't effective. Designers shouldn't wait to see their designs get watered down, they should anticipate business, technical, and marketing compromise and look forward to the inevitable negotiations.

When a designer pushes back against feedback, I watch carefully. There's a fine line between championing your point of view and missing an opportunity to refine a vision into something better. Technical limitations may be frustrating, but so are yield signs and warning cones. They shouldn't be ignored outright.

I think we need to teach a slight tweak to the "present multiple explorations" model. We should still know how to expand, how to think in multiple directions and present a strong design rationale for each. But we'd do well to know how to "degrade gracefully", the way web developers do.

We shouldn't just present options 1 through 3. We should know what each looks like if it can do 10%, 20%, or 30% less. We should know what's required for version one, and what can wait until version two. A designer should know what truly breaks the design, not just insist on the whole design shipping precisely as proposed. A sound vision doesn't collapse into dust after a small adjustment. Designers who assume theirs will are inexperienced, naive, or arrogant. Perhaps all three.

Modern product design is an ongoing discussion, a grand compromise across an array of teams and different factors, almost all of which are beyond the designer's control. Great design needs to be able to flex while still staying true to the original insights and opportunities identified. Great experience design cannot be all or none. If it can't bend, it'll break.

Airports Made of Cake

While studying Albert Einstein's brain, scientists discovered an abnormality. He was missing an area called the Sylvian fissure, which some theorize led to his non-linear way of thinking. Einstein believed his greatest advantage was his ability to think non-traditionally about problems and their possible solutions, not sheer mental horsepower.

Years ago, I was taught the principles of "lateral thinking", which are designed to encourage exactly the kind of breakthroughs that great thinkers are able to realize. So I use these techniques when I'm leading a group brainstorm to try to recreate some of the Einstein magic. The exercises are simple, effective, and fun, yet I rarely see them in the workplace, even in creative fields. So here are a few pages from my playbook.

Call It Design Play

If you stand in front of a room and say, "we're about to think outside of the box", you'll probably be booed. Instead, simply say you'd like to lead the room in a round of design play before kicking off the official brainstorm. Labels matter, and "Design Play" has always worked for me.

Make It Fun Immediately

Breakthroughs happen when people are comfortable and happy, so the first step in any brainstorm is to lighten the mood of the room. It shouldn't feel like work. No one should feel like they have to look cool. I've found that people are very receptive to things that remind them of being a kid again.

"What If Airports Were Made of Cake?"

I always lead with this question while describing how design play works, with a bit of a preamble about how despite being a sort of goofy activity, this kind of thinking does result in good ideas. Something like:

"I'll walk you through the classic lateral thinking example, and how it actually resulted in some interesting insights. What if I were to ask you what would happen if airports were made out of cake?"

Notice how the room reacts to this. There's going to be a blend of disbelief, a bit of awkward hesitation, and in my experience, some smiles. Then a joke usually comes out, like "really delicious disasters?" followed by laughter. Then another joke. If the jokes don't come, I make them myself.

The mood of the room, in my experience, immediately changes for the better once the laughs start coming. It's not as casual as grabbing a beer after work, but it's not like a standard meeting, either. It inspires a third and too uncommon atmosphere, one that reminds me a lot of elementary school. And for most people that's an awesome feeling.

Stress the Results

But it's not just about getting people laughing. This kind of thinking does work. Here's what I say once the room gets their giggles out:

"It sounds totally crazy, I know. But we're all laughing, we're enjoying ourselves, and that's the moment where good ideas get made. Because somewhere between thinking we're wasting time and making jokes, your mind shifts into a creative mode.

So that's why it's important to talk about odd things like airports made of cake. Someone will inevitably talk about the floors being soft and padded, which can lead to statements like 'it would be the quietest airport ever!'"

And bingo. That's the moment of insight.

Legend has it that this exercise did lead to a new airport being built with acoustics in mind, in order to provide a more pleasant traveling experience. And that's a great example of how lateral thinking can go into outer space but then return with a powerful, actionable insight.

Get The Wheels Turning

To begin the brainstorm, I give everyone a pad of sticky notes and a pen, and instruct them to write everything they could possibly do with a brick. I give them 5 minutes.

Everyone's first few sticky notes are the same: throw the brick, prop open a door, make a house, etc. But as time goes on, people will find themselves forcing their thinking further out. This is where the mind can stretch and come up with novel new ideas.

Share

Then everyone should pick one or two of their favorite ideas to share with the group. It's fun, and it stresses the point that your first ideas are usually not very good. There's usually more laughter as camaraderie grows.

Dive In

This is where the brainstorm diverges. Sometimes I lead another "what would you do with a brick" activity, but I angle it more towards the actual product we're designing, and give people more time for the sticky notes.

Other times I ask people to imagine they're famous characters, like "How would Jay-Z design this? Mario? Michael Jordan? A toddler?" And at the end of the day, sometimes we rank our ideas, or map insights, or leave it open for a second brainstorm. It varies.

It all depends on the project, and the insights we're trying to get to. But one thing is the same, no matter how many times I do this, regardless of the client, the team, or the product. People love thinking creatively in design play. Sometimes it's just your job to build a temporary space for them, just long enough to get to the real work. Sometimes it's your job to make it ok to think about airports made of cake.

Consistency Considered Harmful

(Lukas Mathis)

"Remove these two buttons and put them into a menu."

"Why?"

"This is the only screen with buttons. It's more consistent if it doesn't have buttons."

When people talk about what makes for a good user interface, they often mention "consistency". This app is bad because it's inconsistent, but this one is good because it's consistent. There are many different things an app could be consistent with, but usually, the two major ones are the host system, and the app itself.

Consistency with the host system is often a good idea because it helps people apply what they learned in one app to another. Internal consistency is often a good idea because it makes it easier to learn how to use an application.

But consistency can be dangerous, because it's so easy to apply, but often doesn't give the best possible results.

If you're working on a new feature, it's tempting to take a feature of your app that's kind of similar, and just do it the same way. Surprisingly often, this is a horrible idea. Designing your new feature so it works exactly like an existing feature may mean that it doesn't work as well as it could. What's more, designing it the exact same way communicates to the user that it will behave in the exact same way.

Your new feature probably isn't that similar to your existing feature. Most things are not the same; if they were, you wouldn't need both of them. So don't default to using the same user interface for different things.

Maybe this is the only screen with buttons, but maybe that's because it's the only screen where the user triggers important actions. So even though all other screens hide their actions away inside menus, it makes sense to put a pair of huge, obvious buttons in front of the user on this particular screen. It's not consistent with the other screens, but it's good design.

The hammer of consistency is often used to pound good design into a pulp of mediocrity. In the end, what matters is how well a user interface works, not how consistent it is with some other user interface.

Design the best solution for the problem at hand; don't start out by copying a solution for a different, similar problem. If consistency truly helps make a user interface better, if you end up concluding that the two really are similar enough that they should look and work the exact same way, awesome. But don't start out with that assumption. Don't be consistent solely for consistency's sake.

Figuratively Literally Language

(Lukas Mathis)

People love to complain about language. "That's not a word!" "Don't end a sentence in a preposition!" "Don't use the passive voice!" "Really? How high did the show have to literally jump to get over the shark?"

I know, because I was one of these people (and sometimes, before I catch myself, I still am).

But text is a user interface. The writer creates this user interface, and the reader uses it. So, the basic rules of usability apply: make sure the user can figure out what the hell is going on as easily and quickly as possible.

That literally means that you need to kill your sacred cows (and by "literally", I mean "figuratively, but emphatically," just like pretty much everybody else who uses the term "literally"). It doesn't matter whether something "isn't a word" by somebody's definition of what words are. It doesn't matter whether a sentence ends in a preposition. It doesn't matter whether a famous little book listing a number of its authors' pet peeves contains a vaguely explained "rule" disallowing a particular grammatical construct with some poorly constructed examples that don't even illustrate the rule they're meant to support.

What matters is that it works. Is your text devoid of ambiguity, complex words, and long sentences? Do people quickly understand what you're writing? Can they get through your text easily, without getting tired?

Language lives. If people understand what you're writing, if it's in common use, if it's a good choice because it's easy to read and figure out, use it. In language as in all user interface design, go with whatever works.

Backstage

Not only had I agreed to dress up as a woman for a company party, I was being a good sport about it. And it wasn't just any game of dress-up, either. I was going to wear a lot of makeup, a wig, a stuffed bra, a sexy outfit, and I was planning on showing midriff. "I can't believe you're letting us do this!" six of my co-workers (filling in as my hair and makeup crew) kept saying. But to me it made sense.

This was right around the time I had pulled the plug on my design blog because it got too popular. In explaining why the site was going away, I used certain words like "introvert", "overwhelmed", and "privacy". I had gotten an Internet reputation as an anti-social recluse even as I was getting some industry attention for delivering highly energetic and passionate talks at SxSW, the art school where I teach, and the design studio where I work.

It doesn't make sense on paper, but as they put on my hair and makeup, I was right where I wanted to be. I smiled with my eyes closed, the adhesive from fake lashes wetly weighing down my eyelids. I'm a performer. Point me towards the stage and watch me go.

"Look how relaxed he is!" my makeup artists exclaimed. "It's common for performers," I heard a man's voice say. "Low-key and calm backstage. He's saving up his energy for the performance that comes later."

"Mmm-hmmm", I murmured, my eyes still closed. Exactly. An hour later, I was part of a parade, then part of an all-afternoon competition. I posed for hundreds of silly pictures, many of me kissing co-workers. I was part of a dance flash mob. I leapt into men's arms. I kept re-applying my lipstick and fixing my bra. I was on all afternoon. I enjoyed helping everyone have a good time.

Then I was done. I slipped away from the party a few hours early and climbed onto a waiting bus. Alone. I reveled in the quiet as I looked across the soccer field at the revelry I had just escaped. I opened my sketchbook and drew up some new design ideas. I jotted down a few ideas for essays, including this one. I laid my head against the cool window and fell asleep.

I spoke at an AIGA event last year. Backstage I was quiet and alone, because I had to be. Then I bounded onto the stage and did my best to do a good job. Then a few side conversations with students, a few good-natured design debates with other speakers, and I slipped out. I didn't watch anyone else or stay for drinks afterwards.

The same was true for my first SxSW talk. I flew in, stayed in my hotel, woke up late, saw a movie, gave my talk, had dinner, went back to my hotel, then flew out the next afternoon after visiting with an old friend, far from the conference. I barely engaged with anyone. I couldn't.

And of course, the blog. I worked really hard for a month to write as well as I could. And I knew if I left the blog open to comments or interacted with people over Twitter, it would distract me. And I knew from experience that distractions would mean the work would suffer.

Some performers are on all the time (though it's not as common as many believe), but that's not how I work. I need time to think, to train and practice, hone and refine. I need to do my hair and makeup away from the crowds. And when I'm ready, I'll go onstage and put everything into the performance.

When I do my job right, I can perfectly disappear into the role. And when that happens, you won't find me backstage. I'll be long gone.

Where We All Sound the Same

Dear designer,

We need to talk.

I'm sure the work on your portfolio site is lovely, but your copy makes a bad first impression. No, there's not a typo. It's worse than that. Your problem is that you sound boring. It's all there in the first line:

"Hi, my name is [name] and I live in [city]."

Yawn. Listen, it’s not that the words are bad by themselves. They're casual, friendly, inviting, to-the-point. The problem is, almost every other designer is saying the same thing. And if you were a developer, or a lawyer, or radio host, that would be fine. But you're a designer, damn it. You invent the future. You do things better. Right?

If I'm visiting your site, I'm trying to determine if you're a great designer. Meaning I'm looking for a great communicator. If your intro line is a throwaway, I'm going to wonder where else your communication skills need work. It's gotten to the point where if someone says "yo" instead of "hi", I'm more likely to remember their site because it's at least a little different than everyone else's. The differentiation bar has fallen that low.

Not to say you didn't pour your heart into the site, or that there isn't lots of goodness in it. You picked just the right typeface, just the right color scheme, and maybe your site uses responsive design so it looks great on any device. You've clearly worked hard on this, so why lead with a line as stale and overused as a "why did the chicken cross the road" joke?

If you’re a designer, you're a professional communicator. This is your portfolio site, and you can do better.

Investment Creates Pride

(Lukas Mathis)

It's no secret that people are proud of the things they make on their own. The more you've invested in something, the prouder you are. I love the feeling of holding something I've built in my own hands; the more work I put into it, the better. I love my own sketches and drawings, even if, objectively, most aren't that great. I think the food I cook on my own tastes much better than food from a restaurant.

This doesn't just apply to physical things. It also applies to computer work. I love writing code and letting it run, the feeling of having created something that works. I love drawing in ArtRage, creating things in OmniGraffle or Sketch, even writing text in BBEdit.

I'm much less fond of Pages, though. When I start Pages and create a new document, it asks me to pick from a template. Usually, I just pick one, and change the text to whatever I need. I'm not proud of that. I don't own the result, I'm just riding somebody else's coattails. True, I'm making something, but I'm not making it mine. It'll never be mine.

I love tweaking a photo I've shot in iPhoto, messing with the sliders until it looks good. Conversely, I don't like the photo apps that allow you to pick a predefined filter. I didn't make that filter. It's not because of me that the photo looks good. Somebody else put in the work, I just clicked on a button.

There's an oft-repeated anecdote about cake mixes. Supposedly, they didn't sell well until one manufacturer decided to allow people to add their own eggs, thus giving them the feeling of "owning" the resulting cake, rather than just making somebody else's cake. The anecdote is false, but the underlying sentiment is real. It's not your cake if you're just putting it into the oven.

GM offers buyers of certain Corvettes the option of hand-building the engine that powers their new car. Buying a car is already an emotional experience, but imagine how much more invested in your car you become when you yourself have hand-built its engine!

When designing creative apps, the line between giving people so much rope that they can't help but hang themselves, and giving them so little that they can't even tie a knot in it, is often very fine. Most Corvette owners probably couldn't build their cars from scratch, but they are perfectly capable of putting together an engine with some help from a Corvette employee.

When working on an application, think about this. Are you letting your users own the things they create, or are they just following in somebody else's footsteps? What's your application's hand-built Corvette engine?

What You're Willing To Leave Behind

One thing I learned in a year in real estate is that everyone wants it all: enough rooms, a yard, beautiful views, great schools, room to expand, not a lot of renovations required, near stores but not too near, next to a park, everything up to code, and all for a price they can afford.

But no one gets it all, of course. So it becomes a question of the one thing you're willing to budge on. Higher crime? Worse schools? A hefty price tag? Fewer rooms than you were hoping for? It's different for everyone.

When my wife and I were home shopping, we wanted a great house, in a great location, for a price we could afford. Like everyone else. So we wrote down what our highest priorities were, and what we were willing to lose in order to get them.

It worked. Our exercise made it very clear which houses fit into our criteria and which didn't, which helped us streamline our process and feel good about the decision we made.

This is a technique I've learned to rely on when designing. There is no such thing as a "perfect design", only tradeoffs. Identify the experience you're working towards and do everything you can to get there. But don't fool yourself into thinking you'll reach success by refusing to compromise. Deft compromise is precisely what makes a design strong.

Designing with Placeholders

(Lukas Mathis)

One of the first projects I worked on was a kind of social network for students (yep, that was way before Facebook). We spent quite a bit of time coming up with a good design. We were proud of ourselves. It was sleek and modern and simple and flat, just the way we liked it in 1998 (and, apparently, also in 2013).

Proudly, we launched it. It actually grew very popular surprisingly quickly. Unfortunately people didn't quite behave the way we intended them to. For example, there was an area for classified ads. When designing the site, we created our own classifieds to test how the layout would look. We didn't account for the kind of strange ads actual humans would come up with though.

People filled their ads with ASCII art. They wrote in all caps or without any caps at all. They wrote insanely long descriptions of the bike they wanted to sell, but forgot to add any paragraphs. Most didn't upload images of the thing they wanted to sell (digital cameras and scanners were rare back then) but those who did often uploaded broken images — Windows BMPs with a JPG ending, for example.

As a result, our pretty layout looked horrid most of the time. We just didn't account for human ingenuity when designing it. Our placeholder data didn't measure up to mankind's chaotic nature.

When Facebook launched their new Android home screen, Facebook Home, they showed mockups of it. I hope Facebook Home's designers didn't base their designs on the photos used in those mockups. Most people's Facebook friends aren't stock image models who love to play frisbee, have the cutest kids in the world, and cook meals that look like they're straight out of a fancy restaurant run by the latest celebrity chef. I hope they also made sure that Facebook Home works well if you have friends who upload dozens of images of their cat every day. Friends who love to post pictures of their drunk buddies. Friends who fill their timeline with selfies. And take photos of their grubby McDonald's burgers.

Don't fool yourself. Don't use placeholder data when designing. Get as much real-life data generated by real human beings as possible and design for that. If you don't, you're in for a surprise.

Overdoing It

(Lukas Mathis)

Placeholder data was, sadly, not the only mistake we made with that project. Here's another one.

When we first designed it, we basically spent a week in a cottage in the Alps, and brainstormed ideas for what the website would offer.

Then we implemented them.

Yep, all of them.

The website had an area where you could look for people to share a flat with, or, conversely, find people who had rooms they wanted to share. Your flat-sharing community could create profile pages, and, of course, individual users could have profiles, as well (this, by the way, was implemented such that communities and users had different, unconnected logins so the site had two entirely disconnected login systems). There was a forum where people could discuss whatever they wanted. We had a market place where people could buy and sell things (and, of course, we also had an online shop where you could order T-shirts and stuff). We had a section where people could send e-cards (remember, this was in the 90s). We had a section where we published information and editorials about flat sharing.

And that's just the top-level features.

(I actually had to go to the Wayback Machine to look up the site; it had so many different features right at launch that I simply couldn't remember all of them.)

Needless to say, people were terribly confused by all of the stuff on our website, and even though the site became very popular for putting out flat-sharing ads, most of the other features never got any traction. In fact, all of the added clutter probably prevented the site from becoming even more popular.

Nowadays many designers have internalized the concept that the first version of your product should ship with the smallest feature set you can possibly get away with, polished to the highest sheen you can possibly get away with.

But it's not just the first version you have to be careful with. For every new feature, you need to ask yourself if it really improves your product, if you can implement it so perfectly that it doesn't drag the rest of your product down.

For a long time, I've owned a simple, tiny USB scanner that I scan all my documents with. It works great. You attach it to your computer, insert a page, hit a button, and the page is scanned. Then the software OCRs it and sends it to a filing application.

Recently, I "upgraded" my scanner to a wireless one. That seemed like a useful feature. It would get rid of the requirement to connect the scanner to the computer. I could scan documents in one place and they'd magically appear over on my computer on the desk.

Unfortunately there are problems. First of all, scanning now takes a lot longer. Previously, I'd push the button, wait a few seconds, and the document was on my computer. The new scanner technically scans faster than the old one. But since it then has to transmit the scanned document wirelessly, I have to wait for each scan much longer than before.

Even worse, the scanner uses a lot of power. It has an internal battery but that's only enough for about 50 scans at most. And I can't just plug it in and keep scanning because the wireless feature of the scanner actually uses more power than its plug provides.

This is the worst case scenario for a new feature. The new feature itself doesn't work perfectly (it's too slow). Even worse, it's also dragging down existing features (now I can only scan a limited number of documents before the scanner runs out of power).

When adding features to an application, don't get ahead of yourself. For every feature, ask yourself:

  1. Can I make this work as well as the rest of my product?
  2. Will this new feature work 100% of the time?
  3. Will this feature never interfere with how the rest of my product works?

When in doubt, only implement features that tick all three of these boxes.

(By the way, when I asked about battery life, the scanner's manufacturer sent me a new power plug that puts out enough power to keep the scanner running. I've gone back to using the USB version of the scanner though.)

The Easiest Version

The first 90% of a project is fun and the second 90% will try to kill you. This is why most projects never ship, and even when they do, they can cause chewed fingernails, bankruptcy, late nights, insomnia, relationship problems, laptops thrown in anger, and so on.

But never forget that version one, relatively speaking, is a cakewalk. It's nothing compared to every other version you will attempt to release. Version one is the high point. It is the closest you will get to the design you first envisioned.

I know, I know. Everyone thinks version one is just the beginning. But most software isn't seriously supported after a year, let alone three. Modern software is a gold rush, and there are more abandoned towns than successful cities. Far more. But we can learn a lot from the software that has managed to make it past the first few versions.

First, of course your version one is clean, simple, and focused. Everyone's is. The question is how well it scales up for the inevitable features you're going to add. That's a much tougher challenge than shipping a focused v1, and it's where truly great designers make their mark.

Second, do you know what your product stands for? Do you have a point of view regarding what new features you'll take on? If not, you'll end up with an "I guess I'll play it by ear" strategy and it's likely your software won't age gracefully. Figure out what you want your product to be the best at, and don't be afraid to decline to work on anything that distracts from that overall goal.

Third, you need to have a plan for supporting people that are using your product. This isn't as fun as whiteboarding new ideas, but it's just as much a part of your product's design as the icon, the copy, the typeface, the color scheme, or the flows. Software design is mostly thinking about how to deal with problems that arise, and customer support is the purest example. Don't overlook it.

So yes, congratulations on version one! It's clean and pretty, and perfectly suits the needs of your target audience. Pat yourself on the back, but know it'll age and bloat faster than you think. Version one is a challenge and an achievement, but the best designers can shepherd their vision to version three and still love what they see.

Sorting Through the Noise

We start with too little, move to too much, and then we need to find the middle.

Older generations grew up with a small handful of television stations to choose from. It was a time of a shared broadcast experience, where you could assume your co-workers were watching the same thing as you the night before.

Then cable came, and brought with it an explosion of new channels. The "couch potato" was born. Bruce Springsteen wrote "57 Channels (And Nothin' On)", which sounded quaint a few years later when we became used to packages with hundreds of channels. And then thousands.

Eventually, we saw the shift from passive "channel surfing" to proactive "time shifted entertainment". Innovations like TiVo meant you no longer had to be home by 8pm to catch an episode live. It was easier than ever to "catch up" to a show everyone else was talking about at work.

Repeating History

With television, we went from experiencing the same thing in a limited way, to an explosion of mostly low-quality content, to a more targeted, curated experience. Now I watch the shows I want to follow, and only them. I haven't channel surfed in years. I doubt I ever will again.

I see a similar story in the way we use our phones. We started on basic and limited devices that no one really loved. Then came the shift to smartphones and our platform and app choices exploded. We all agree that smartphones are better (though more expensive) than the feature phones they displaced. We love the choice of having millions of apps to choose from.

But the data tells us that most people use apps a few times, then don't touch them again. Many phones show screen after screen of apps that the owner hasn't used in months or years. That's not a strong ecosystem. That's an app graveyard.

So what's the smartphone version of TiVo? What's this generation's version of focusing only on the areas that matter? It's not sustainable to support a million app developers when fewer than 1% of them have a significant user base.

Which is not to say we should stop making apps. Even in the worst of the cable years, there were always great shows popping up, and that'll be no different with apps. It's a numbers game: more apps, more experimentation, more hits.

But there came a point where people stopped channel surfing and started guarding their TV time more closely. That moment will arrive in the app space. It will transform how we use our phones, and we'll all be better for it.

Mobile Has Reached Peak Delight

[This essay was written before iOS7 was announced. Several of the complaints in this essay were addressed, which was nice to see. –Ed.]

I have no doubt that the mobile space will continue to be very profitable for the foreseeable future. There's clearly a lot of growth left in it.

But one of the biggest mistakes people make is over-emphasizing tangibles like financial growth and devaluing intangible but significant factors like happiness, love, and satisfaction. So while I believe that mobile will continue to drive profits in the technology industry, I believe delight on mobile has peaked. I think it's only downhill from here until the next big thing.

Nailing the basics

We've gotten used to the idea that phones will let us communicate with everyone in the world via any shared protocol. Phone, email, text, and web are built-in, and 3rd party apps like Facebook, Twitter, and WhatsApp are available on every platform. These are the basics of communication, and we've gotten them pretty well nailed.

Searching for the Next Commercial

All the big phone and app makers are searching for the next "wow" idea so they can market it within an inch of its life. But the basics have been addressed, so what's left is a bunch of space age ideas that aren't yet palatable to the mainstream.

Voice-powered personal assistants, wearable computing, and the ability to blast content from one screen to another are all important improvements. But the mainstream doesn't yet have a reason to care. Their basics are covered, and these geek technologies looking for a reason to be marketed smell like desperation.

While we look for the Next Big Thing, treating mobile like the tech equivalent of the season-driven fashion industry, there are a tremendous number of problems in the mobile space that are eroding delight more every year.

Problem #1: Updates

Software makers are very proud when they ship product on time and on a schedule. Doubly so if they're "agile". Frequent software releases are held up as a common sense feature that customers love.

Wrong.

Look, Mr. Software Maker, I know you addressed a crasher bug with unicode characters in landscape mode, or changed your settings icon to be less pixelated, or whatever. But no one cares beyond a tiny minority of superfans.

And it's worse than that. It's not just that people don't care, you're actively causing them stress each time you push an update. Seeing an App Store icon with a badge that says "76" is not fun, nor delightful. Increasingly we're seeing people simply refuse to update.

One solution is defaulting to auto-installed app updates. If you're on a Wi-Fi connection and plugged into power (which many people are when they sleep), the phone should install the latest and greatest software for you.

Are there drawbacks to this? Should overriding be possible? Should there be a way to hold back software for three days in the case of a release going out with a bug? Yes, yes, and yes.
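
As a rough sketch, and only that, such a policy could look something like this in Python; the three-day hold-back window and the function name are my assumptions, not anyone's shipping behavior.

    from datetime import datetime, timedelta

    HOLD_BACK = timedelta(days=3)  # time for a bad release to be caught and pulled

    def should_auto_update(on_wifi, plugged_in, released_at, user_opted_out):
        if user_opted_out:                    # overriding stays possible
            return False
        if not (on_wifi and plugged_in):      # e.g. charging overnight on Wi-Fi
            return False
        return datetime.now() - released_at >= HOLD_BACK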

But these are small tweaks that will go a long way towards fixing a broken system. Updates are a chore, and need to be fixed.

Problem #2: SMS Spam

2007 was a magical time when SMS spam wasn't a problem and email spam had started to be less of a problem. As a result, the ability for a mobile device to block spam wasn't a critical feature. And a good thing, too, because it's a multi-step process that's hard to design well on a small screen.

Well, get ready. SMS spam is on the rise and it's going to get worse before it gets better. Probably a lot worse. Until smartphones come built-in with strong SMS blocking, we're going to suffer through a lot of frustrating SMS spam that makes our phones less delightful.

Problem #3: Notification Spam

Notifications have gotten ridiculous. Some notifications are highly important, like receiving a timely text from a good friend. Others, like a game informing you that your character can now afford to buy a new item, are not. But until we come up with a better system for notifications, it'll be like the boy who cried wolf. We'll learn to tune them out, and will miss the important ones.

I think there should be a tiered system. By default, trusted apps (such as the built-in text messaging app) should be able to notify aggressively. Every other app, such as 3rd party games, should require you to opt-in to the types of notifications you want to see. And not in a blanket "yes/no" way. The app should be forced to prove why every different kind of update it wants to send is worth turning on.
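
Here's a minimal sketch of how such a tiered policy might behave, assuming per-category opt-ins; the app and category names are made up for illustration.

    TRUSTED_APPS = {"messages", "phone"}       # may notify without asking

    # everything else is opt-in, per notification category
    user_opt_ins = {
        ("some-game", "friend_request"): True,
        ("some-game", "in_game_currency"): False,
    }

    def should_notify(app, category):
        if app in TRUSTED_APPS:
            return True
        return user_opt_ins.get((app, category), False)

    # the game announcing that your character can afford a new item: silenced
    assert should_notify("some-game", "in_game_currency") is False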

Problem #4: Greedy Defaults

Software makers know that people don't change their defaults very often. So if you can choose between a polite setting that makes you less money or a rude one that makes you more money, there's a big incentive to go with the rude money-making one. If you add up all the user-hostile defaults on mobile, it adds up to an experience that makes mobile less delightful.

Problem #5: Website Spam

As everyone races to release apps, we're increasingly seeing the "download our app!" banners on websites we're trying to visit. This is making the browsing experience worse, and to little benefit. When someone sends you a link to an article, it's becoming common to have to click through the "no, just take me to the content" dismiss links, often in tiny type and hidden in the corner of the page.

Problem #6: Spam That Claims It's Not Spam

Somewhere along the way, it became popular to say "we promise not to spam you" on email forms. But little changed. Now almost every site you visit, every product you buy, and every beta you sign up for believes it's your best friend. "We've been hard at work!", the messages chirp. Meanwhile, I don't even remember who the company is, and I certainly don't care that they just added some incremental new feature.

I think back to their promise not to spam me. I wonder how they'd categorize these relentless status emails. One email from one overzealous company is easy to ignore. It's the aggregate I get every day that weighs me down, drowns out the messages I care about, and makes sifting through email a chore. And they still don't admit that any of it is spam; they think it's "spreading the word about an exciting new feature".

We Know How This Movie Ends

In many ways, the decreasing delight in the mobile space mirrors what happened in the PC industry. After the novelty of personal computers and the internet wore off, we were left with these amazing machines that were a pain to use. They broke a lot. They crashed and lost our data. They had viruses on them. But we still loved them. It felt like they could do anything, if only we could make them work right.

This is what was so amazing about mobile, for a time. It had none of the window management, viruses, complex configurations, strange incompatibilities, or other problems that plagued the personal computer. The difference between PC and phone was pretty freeing. For a while.

It's All Too Much

Things have changed. Mobile is much more reliable, but it's repeating many of the mistakes that make the desktop computer such a cognitive and emotional burden. But this time, the device isn't off in a side office somewhere, it's in our pockets, in our hands, and woven into our lives. This makes its PC-like shortcomings much harder to avoid than in the PC era.

The industry's drive towards providing more features just inflames this problem. If I could choose between a phone that was more respectful of my time or had more features, I'd choose the respectful one. But that's not an option, so my $20 basic feature phone is looking more appealing by the day.

I think we're entering what will be an interesting era in mobile. And if it's as frustrating as it looks like it may be, it'll be ripe for disruption. Maybe sooner than we think.

Magical User Interfaces

(Lukas Mathis)

Apple's ads portray Siri as a kind of person. Samuel L. Jackson talks to it as if it were a human being, and it responds in kind. That's magical.

My Android phone's calendar is magical, too. It allows me to enter appointments as natural text. I can type "meet Jon at 5pm" (which I usually don't, because meeting Jon would involve a day of sitting on a plane). My phone parses that text and creates the appointment.

Neither of these two user interfaces works well. Apple wants you to think of Siri as a magical anthropomorphic being that lives inside your iPhone, not as a bunch of algorithms. Siri's user interface is patterned after human conversation. You say something, Siri thinks, Siri answers. When it doesn't work, the only thing you can do is randomly guess what went wrong and try again.

Likewise, my phone often doesn't get what I intended to say, so the next step after entering an appointment is looking through the calendar to see if I can actually find the appointment I just created.

With both user interfaces, my action and my phone's response are disconnected. The phone is trying to appear magical. It's guessing what I'm trying to do. It's not showing me what's really happening — that would be revealing the magic trick.

But it's better to expose some of the underlying application logic to the user than to appear magical. "Magical" is usually not a good attribute for a user interface. "Magical" often means "obtuse and hard to learn". When you're designing user interfaces, you want things to be predictable and repeatable, not magical.

Of Course

When people try to design magical interfaces, they’re often aspiring to the “wow” moment. But that’s the wrong focus. Designers should instead be focusing on “of course” moments, as in “of course it works like that.” Most product design should be so obvious it elicits no response.

The problem with aiming for “wow” is that when you try to innovate, your design process encourages novel new interactions. But by definition, novel and innovative are often not familiar or intuitive, which can easily tip over into “hard to use.”

On the other hand, the Nest thermostat, the iPod click-wheel, iOS’s pinch-to-zoom, the Wii controller, the first Google Maps, pull-to-refresh, and many other recent design success stories have a very strong “of course!” sense to them. Think back to using those products the first time. You probably didn’t struggle with them, which is why they succeeded despite employing unfamiliar interactions.

Great design does what it is supposed to with minimal fuss, and without drawing attention to how clever it is. That’s “of course” design. It’s wickedly difficult to pull off, it doesn’t garner headlines outside the design world, but it’s all your customer really wants.

Designers going for “wow” are too often leading with their desire to be recognized for an ability to innovate. Designers going for “of course” are earnestly attempting to fade their design into the background. They want their solutions to feel like they always existed, like there could be no other option.

Forget “wow”. You can’t conjure it. Instead, work to incorporate “of course” into your product. Your users will love the result, even if they can’t put their finger on why.

Brand is Misunderstood

A few weeks into designing an app, my CEO asked why the company logo wasn't more visible. Branding on the website, the app logo, and the loading screen all made sense to me. But putting it in the chrome of the app itself seemed odd, so I asked for clarification.

He liked the idea that when using the application, the company logo was there, presumably logging eyeball time. I said "I'm not sure that's a reason to clutter up the interface with a marketing element."

"That's what branding is," he responded. And that's when I realized our problem. We both wanted the start-up to succeed. We both believed that a strong brand would strengthen our chances for survival. But we understood brand to mean contradictory things.

He believed that making the logo more prominent helped the brand. I believed that brand was nothing more than the reputation of a product. He believed that missing a chance to add the logo weakened the brand. I believed that distracting a user with too many marketing elements could harm the brand.

I pointed out that on Windows, the app icon is already in the window of the application, so his plan would lead to two company logos on the screen. The argument didn't work. I'm pretty sure he would have been fine with three logos. Because that's how he understood brand. For many people, brand means "put logos everywhere."

Braiding

While it's true that creativity and inspiration can be hard to explain, quantify, or reliably trigger, there are some things we do know.

First, creativity requires headspace. Think of your favorite novel, painting, movie, or album. None of them were made in a noisy bar, at a whiteboard, during a road-trip, watching TV, or hitting reload on Facebook. They required hours of effort that from the outside looks a lot more like studying for a test than the backslapping and noisy mental image we get when we think of collaboration.

Second, no one is an island, especially not when designing complex product experiences. Yes, there is a time to put your head down and produce. But getting to that point often requires collaboration, iteration, and a ton of dead-ends. Even writers, famously solitary, still run ideas by other people.

I think the latter is well understood. Steve Jobs once compared product design to a rock tumbler, where ordinary ideas get polished and refined through a long process of everything colliding and bumping into each other. And it's why "open studios" are all the rage. Designers love open space! They need to collaborate! The more white board discussion, the better! Right?

Well, sort of. We may have over-indexed on collaboration. We're not giving ourselves nearly enough time to think. To produce. To take all the wonderful conversations we're absorbing at the studio, on Twitter, from inspirational design sites, blog posts, and the like and actually do anything with them.

On my teams, I implement something called "braiding", or scheduled time apart, followed by scheduled time together, back and forth, repeating throughout the project. The idea is that the team brainstorm is made better by people working on their own first. And then the brainstorm gives us some interesting ideas to go push forward on our own.

It's doubly useful if you have someone who can prototype, because you can go from crazy whiteboard ideas, to someone throwing together a rough prototype alone at their desk, to everyone seeing the working example, sometimes all within the same day. The key is knowing when it's time to stop talking and go build something, which some teams struggle with.

Most builders I know do their most productive work at home, away from the office. Or they barricade themselves in a conference room with their headphones on. When they do, they're done talking. They're at a place where they just need to produce. It doesn't make them anti-social or bad team players; it's just a realization that the modern office is pretty bad at actually producing work. It's designed for talking about it.

We need to stop worshipping multi-tasking, collaboration, and open offices so we can find ways to give people their headspace back. Find opportunities during the process to get some time to work through ideas, and you'll find you're much more insightful and thoughtful in meetings. That's braiding: the best technique I know for creating high-quality content in an environment that expects you to be in meetings all day.

Things Regular People Do

(Lukas Mathis)

When you're working in tech, it's easy to lose sight of how strangely most of your friends behave compared to the average person. They care about Dropbox-compatible iOS text editors, they worry about Markdown dialects, and they use Gmail-only email clients.

It's important to keep in mind that most people are not like your friends. Regular people don't buy music on Amazon, convert the MP3s to smaller AAC files, and synchronize them to their iPhones. In fact, they probably never synchronize their iPhones, and just browse YouTube to listen to music. Also, after they've taken so many pictures and movies that their phone's memory is full, it's effectively broken to them, with no obvious way of fixing it.

Regular people don't manage their applications. They click on the download link, and then just let the app sit inside the Downloads folder. Can't find the app anymore? Just download it again.

Regular people don't manage their files. They just save them wherever the Save dialog first suggests, and then access them via the "Recently opened" menu, or by searching.

Regular people use Microsoft Word, rather than a plain-text editor. And Microsoft Word, rather than Photoshop. And Microsoft Word, rather than a journaling application. If you ask them to send you a screenshot, they paste it into a Word document (because that's the program they know best) and send the Word document by email, and when the file gets too large for that, they don't send it at all.

How to Design

When designing applications, it's easy to assume that people actually understand how computers work, what a file system is, or how to move data between computers. After all, your friends know that stuff; doesn't everybody?

Meanwhile, in the rest of the world, most people don't know these things, and shouldn't have to. As computers become more appliance-like, the inner workings of these devices don't just become less important to users, they also become harder to learn. And why should people learn about them? Your users are doctors and car mechanics and carpenters. They're experts in their own fields. They're not computer experts.

What to Design

There's a bigger issue than how to design for regular people, though. Look at the kinds of products people in the tech industry love to create. Todo lists and Dropbox-compatible plaintext editors are probably at the top of the list; bonus points if it's a Dropbox-compatible plaintext-based todo list. Meanwhile, people outside of the tech industry couldn't possibly care less about this stuff if they tried (which they don't, because they don't care).

Here's another example. People in the tech industry have been "solving" the problem of setting up meetings pretty much since networked computers were first invented. But they solved it for themselves. Setting up meetings requires everybody to use the same calendaring solution. It requires invitations and time slots and reminders. People don't care about that.

What's the first thing normal people do when they set up a meeting? They try to find a date that works for everybody. How does that work in, say, Exchange? Usually, somebody sets up a meeting, and then half of the invited participants complain because the date doesn't work for them. This is then followed by a few days of hashing out dates, until eventually another invitation is sent out, at which point the whole thing begins anew.

It took until 2007 for somebody to create a solution to this problem that works for regular people. Technically, Doodle.com is so simple that most techies will hesitate to even call it a product. The person who wants to invite others to a meeting sets up a new page on Doodle.com and enters a few dates. Then, participants open that page, enter their name (no accounts needed), and check all of the dates that apply. Eventually, the creator picks the date that works for the most people.
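
Here's roughly what that amounts to in code. This is just a minimal sketch in Python; the names (SchedulingPoll, add_response, best_date) are my own invention, not anything from Doodle itself. The whole product is essentially "collect checkmarks against a few proposed dates, then pick the most popular one":

    from collections import Counter

    class SchedulingPoll:
        """A bare-bones Doodle-style poll: a few proposed dates, no accounts."""

        def __init__(self, title, proposed_dates):
            self.title = title
            self.proposed_dates = list(proposed_dates)
            self.responses = {}  # participant name -> set of dates that work for them

        def add_response(self, name, available_dates):
            # A participant just types a name and ticks the dates that work.
            self.responses[name] = set(available_dates) & set(self.proposed_dates)

        def best_date(self):
            # The organizer picks the date that works for the most people.
            votes = Counter(d for dates in self.responses.values() for d in dates)
            if not votes:
                return None
            return max(self.proposed_dates, key=lambda d: votes[d])

    poll = SchedulingPoll("Project kickoff", ["May 3", "May 6", "May 10"])
    poll.add_response("Anna", ["May 3", "May 10"])
    poll.add_response("Ben", ["May 10"])
    poll.add_response("Carla", ["May 6", "May 10"])
    print(poll.best_date())  # prints "May 10", the date that works for all three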

Most programmers could implement this in a few hours. But they didn't. People working in the tech industry create solutions that solve their own problems, that work for themselves and for other people who work in that industry. Not ones that fit the way most other people behave, or that solve their problems.

Designing products that non-experts can use and want to use requires knowing about what the heck they do with their computers. It requires empathy for people who, in many ways, are very different from us. The first step towards a better understanding of how people who aren't like you use computers? Say "yes" the next time some distant relative calls you and says "you know about computers, right? I have this problem, maybe you can help?"

Be Bold

(Lukas Mathis)

Recently, a friend of mine noted that tech user interfaces often innovate for innovation's sake, not because the new user interface is inherently better. "Cars don't replace steering wheels with gestural systems", he said.

That got me thinking. Indeed, the car's user interface hasn't really changed since the late 19th century. But honestly, isn't the car's user interface pretty terrible?

Originally, cars had steering wheels because of mechanical requirements. Today, these requirements aren't really there anymore. Car manufacturers could put anything they wanted into their cars. Joysticks, perhaps, or W-shaped control yokes, like the ones used in planes.

I don't think we still use steering wheels because they're inherently awesome user interfaces for controlling cars. I think we use them for other reasons: regulation, the cost of bringing something different to market, and the investment every driver has already made in learning the current system.

Now, some of these reasons apply to other technologies, too. If you're writing banking software, for example, you have to put up with tons of regulations. If you're trying to write a new operating system from scratch, you're entering a cost-intensive market. If you're trying to replace the qwerty keyboard layout or the dials on an SLR with something better, people will resist it because of the upfront investment — the Dvorak layout may be better, but who wants to invest all of that time into learning it when the layout they already know works well enough? Worse, if they learn a new layout, they'll only be able to use it on their personal computer. Everything else will still have a qwerty keyboard.

In most situations though, you're not limited by these things. So why stick to the steering wheel when you can try something new? Be bold. Try new things.

Limitations

In art school, I was assigned a project meant to get me thinking: "design a webpage as if there are no technical restrictions." Instead of being inspired, I was puzzled. The more I thought about it, the more frustrated I got.

This was the late 90s, when nothing worked right. CSS was barely used, different browsers rendered pages very differently, and of course bandwidth was maddeningly slow. The assignment was meant to help us look into the future. But I wanted to learn how to be better at designing for the challenges we were dealing with every day.

For example: how could we compress images so they downloaded faster? Was there a way to write webpages that looked the same in Netscape and Internet Explorer? What techniques could we use to convert the awful mess that was early internet design into something as beautiful and mature as print design?

The best way to see the future is to invent it. And the best way to invent it is to improve on the present. Yet here we were playing at the future by pretending the present didn't exist.

This is why I love building products. Inherent in a designer's job description is that we see what's wrong, but we don't stop there. We try to improve things, and we strive to push things forward. Not by fantasizing about how great it will be someday, but by rolling up our sleeves and getting there. By working at it, we build the future.

I'm concerned that our hand-wavy future worship has, for many, supplanted truly human-centric design. I'm concerned that some genuinely believe that only through stripping away compromise and ignoring realities will we find our creativity and make real progress.

That's exactly wrong. The future will have more limitations, nuances, and caveats than today, and will challenge us in ways we don't yet realize. The best designers of the next generation won't have time to dream about the distant future. They'll be too busy building it by addressing the problems around them.

Design is how it works

(Lukas Mathis)

There's still this idea that you can write code first, and design later. Lots of companies work like that. Programmers implement something and put the most basic UI on top of it, then designers swoop in to kiss it and make it all better.

That doesn't work. Here's an example. Let's say we're designing an email client's undo feature.

If the user types something and hits undo, you don't want to undo her input character by character. But you also don't want to remove everything she's written. So you need to store checkpoints while the user is typing, preferably when she's taking a break. If she doesn't take any breaks, you still want to add checkpoints every sentence or so.

Let's say the user deletes a message. This command may be sent to the server, where the message is deleted or moved to a Trash bin. If the user undoes the action, it also needs to be undone on the server, which may involve several different steps.

Finally, let's say the user sends a message, but then decides that it was a terrible idea to quit that job after all. Undo! Undo! Can you undo sending an email? In some systems (Gmail, for example), yes, for a limited period of time.

These examples don't just involve some UI sugar on top of existing code. They influence the system's architecture. If programmers didn't think of these cases before writing the backend, implementing them could involve much rewriting of code. Nobody wants that.
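
To make the first of those cases concrete, here is a small sketch of how typing checkpoints might work, assuming a coalescing rule of "a pause or the end of a sentence starts a new undo step." It's in Python, and the names (TypingUndoStack, on_keystroke, PAUSE_SECONDS) are mine, not those of any real mail client.

    import time

    class TypingUndoStack:
        """Coalesces keystrokes into pause- or sentence-sized undo checkpoints."""

        PAUSE_SECONDS = 2.0         # a break in typing starts a new checkpoint
        SENTENCE_ENDINGS = ".!?\n"  # so does finishing a sentence

        def __init__(self):
            self.checkpoints = [""]   # each entry is the full draft at that point
            self.last_keystroke = 0.0

        def on_keystroke(self, draft_text, typed_char):
            now = time.monotonic()
            paused = (now - self.last_keystroke) > self.PAUSE_SECONDS
            if paused or typed_char in self.SENTENCE_ENDINGS:
                self.checkpoints.append(draft_text)  # start a new undo step
            else:
                self.checkpoints[-1] = draft_text    # keep extending the current one
            self.last_keystroke = now

        def undo(self):
            # Drop the newest checkpoint and return the draft to restore.
            if len(self.checkpoints) > 1:
                self.checkpoints.pop()
            return self.checkpoints[-1]

The other two cases can't be sketched this locally, and that's the point: restoring a deleted message, or holding an outgoing one during a short undo window, needs the server's cooperation, so it has to be part of the backend's design from the start.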

Design is how it works. Design behavior before writing code.

We Are All Builders

Interaction design, visual design, and code form the three-legged stool of software design. You need all three for balance, they're each equally important, and they're all different from each other to the same degree. Visual design and interaction design are just as far apart as code and graphics.

A strong understanding of color theory doesn't mean you know how to write JavaScript, and it doesn't mean you'll intuitively know how to structure a great flow. Having an impeccable eye for fashion doesn't mean you understand interior design doesn't mean you're great at lighting doesn't mean you're a musician doesn't mean you're a natural at game theory doesn't mean you can design a delightful first-run experience. Being good at one aspect of design does not necessarily translate to other areas.

Sure, they're all technically done by "designers", but I like the word "builder" more. We're all building products. We're all builders, and we all specialize in different parts of the three-legged stool.

Blended Builders

It's not as if these different fields have nothing to do with each other, of course. A builder benefits tremendously from additional knowledge, whether marketing, engineering, biology, psychology, botany, linguistics, or finance. Every new bit of information can and should spin thinking in new ways. A blended background is something professional builders are increasingly seeking out.

Whether it's fantastic visual designers who have learned interaction or interaction designers who have learned to code, great products require teams of people who can blend skills. A few years ago, the joke was that these blended builders were as rare as unicorns. In the near future, builders who can only do one thing will be the rarities. And they'd better be really, really good at their one specialty if they want to thrive.

Blending Takes Work

Which isn't to say it's easy. Print-inspired screens atop frustrating flows are the hallmark of the visual designer who doesn't yet understand software design. Those mistakes are not uncommon, but they're not made by strong interaction designers who understand the limitations of their field. It goes the other way as well: just because someone's created a novel and useful affordance for interacting with content doesn't mean that person can make a page look beautiful, delightful, or inviting. It doesn't mean they understand composition, balance, typography, or style, and it doesn't mean they have a good eye. Blending skills is valuable, but it takes a lot of work.

You Need At Least Two

We're in a field that requires a team to show talent in a lot of different areas, which means the more hats you can wear, the more valuable you are. But don't assume that because you're wearing one hat you can automatically wear the others just as well, or that the other areas matter less. Visual, interaction, and code are all equally important, so pick at least two. And remember that there are people competing for your dream job who are learning all three.

Where There Never Was a Hat

I don't really understand the focus some people have on legacy. If I'm good, I'll want to hear about it while I'm alive. When I'm gone, I won't care anymore. This excerpt from Penn of Penn and Teller hits closer to home:

“Doing beautiful things is its own reward. [...] If you do something that you’re proud of, that someone else understands, that is a thing of beauty that wasn’t there before – you can’t beat that.” He gulps suddenly, like a snake trying to swallow an egg, and when he speaks again his voice has a wobble to it.

“There is that great line in Sunday in the Park with George,” he says, referring to Stephen Sondheim’s 1984 musical about Georges Seurat, “‘Look, I made a hat where there never was a hat’.” He falls silent again and, as unexpectedly as those coins turn to fish, big fat tears start rolling down his cheeks.

“I can’t say that line without choking up, because it states, in profoundly poetic terms, what I have always wanted to do with my life. It’s so simple and so funny, but boy it hits me deep.”

I don't expect you to remember this book for long. Most writing is lucky to be remembered the day after it’s read. But I'm proud that Lukas and I made a hat where previously there was none. It's what I've always wanted to do with my life.

So, what about you? What is your next hat?